A Guide to “Zoomiquette”

The Marlowsphere (Blog #152)

In the early days of the World Wide Web (introduced to the planet in 1989 by Sir Timothy Berners-Lee), the ability of people all over the planet to e-mail each other 24/7 quickly became a defining characteristic of the communications landscape. And as with all new technologies, informal “standards” for the use of the Internet, à la email, soon hardened into certain “rules” of use. These became known as “Netiquette.”

Today, the new technological “thing” to do is “to Zoom.” The advent of COVID-19 has greatly accelerated the use of the platform (officially released on January 25, 2013). Its stock now rests somewhere in the range of $250/share. No wonder: it has 200 million daily users.

It also has competitors: Amazon Chime, GoToMeeting, Google Hangouts Meet, BlueJeans, Microsoft Teams, Cisco WebEx, and Signal. But like the “Kleenex” brand, “Zooming” has become the generic term for using the Internet for “electronic gatherings.”

Although Zoom is not the first electronic technology to allow aural and visual communication among people, the platform’s explosive growth has brought the need for certain standards of use. Why? Zooming is really broadcast and cable television in an organizational and personal context. These electronic media have evolved visual and aural “standards” over many decades. People have certain expectations as to how television should look and sound, and meeting those expectations makes for more effective communication.

So, below I offer a few tips on the emerging “Zoomiquette” so you can look and sound your best and have the best communication experience possible.

1. Avoid the “guillotined” look. How many times have you observed a participant on a Zoom call whose head is just above the bottom part of the screen, looking like they just became another victim of the French Revolution? The guillotined look is not very becoming.

Solution: The video production standard is to divide the screen into three imaginary sections: a top third, a middle third, and a bottom third. A person’s eyes should be on the bottom line that defines the top third of the screen. This way the other participants will see that the head is actually connected to a body, at least the neck and shoulders. You can accomplish this by simply adjusting the angle of the laptop screen.
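For the mathematically inclined, the rule of thirds reduces to simple division. Below is a minimal sketch of the eye-line arithmetic; the frame resolutions are illustrative assumptions, not anything Zoom specifies.

```python
def eye_line_row(frame_height_px: int) -> int:
    """Pixel row where a speaker's eyes should sit under the rule of
    thirds: one third of the way down from the top of the frame."""
    return frame_height_px // 3

# Illustrative webcam resolutions (assumptions, not Zoom requirements):
print(eye_line_row(720))   # 240 -- eyes ~240 px from the top at 720p
print(eye_line_row(1080))  # 360 -- the same proportion at 1080p
```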

 

2. Prevent the “heavenly lighting” look. This is a classic error, even among people in the media production business. The problem occurs when someone on a Zoom call has a bright light behind them instead of in front of them or to the side. This is seen often when someone sets up their laptop with daylight streaming through a window behind them. The cameras installed in laptops respond to the brightest spot in the picture. With this setup the camera looks at the bright light and says, “OK. I’ll adjust everything to that bright light!” The result is that the person’s face goes dark. They might look like they have a halo around their head, but no one can see their face.

Solution: Avoid having a window or a bright light behind you. Position the laptop so that the window or bright light is in front of you.
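To see why the window wins, consider a toy model of auto-exposure. This is a deliberate simplification for illustration, not how any particular webcam’s firmware actually works.

```python
def auto_expose(face: float, window: float, target: float = 235.0):
    """Toy auto-exposure: scale the whole scene so its brightest element
    lands near the top of an 8-bit brightness range (0-255)."""
    gain = target / max(face, window)
    return face * gain, window * gain

# Backlit setup: the window is ten times brighter than the face
# (the numbers are made-up relative brightness units).
face_out, window_out = auto_expose(face=200.0, window=2000.0)
print(round(face_out))    # 24  -- the face is crushed toward black
print(round(window_out))  # 235 -- the window is exposed "correctly"
```

Put the light in front of you and the same logic works in your favor: your face becomes the brightest element, and the camera exposes for it.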

 

3. “I just love the paint on your ceiling.” Here’s another pitfall. People place their laptop on a table that’s below their own line of sight (in a few instances it’s above their line of sight, and everyone gets a shot of someone’s floor). As a result, the laptop camera is looking up at the participant and we also get a glimpse of the person’s ceiling.

Solution: Position the laptop screen so that it’s level with your eyes.

 

4. “If I knew you were coming, I would have straightened up a bit.” The COVID-19 pandemic has forced a lot of people to hunker down at home. Now Zoom has come into the home. In effect, your home has become a television studio, or at least a video location. Find a location for your Zoom meetings that presents you in a semi-professional setting. I’m sure you’ve noticed that many reporters, experts, and pundits have chosen a place in their home that looks like a library, i.e., there are books in the background.

Solution: Avoid setting up your location against a blank wall. Choose a place where there’s something of interest in the background. Often it’s a bookcase or something that reflects who you are. Take care not to have anything distracting, too personal (like family photos), or offensive in the background.

 

5. “Just say no to vertical video.” A minority of “Zoomers” use their phone to access the Zoom platform, but in so doing they position the phone the way they use it for other purposes: they hold it vertically. The result is a columnar visual effect, rather than the horizontal picture characteristic of laptop users. By holding the smartphone vertically the user creates blank “bookends” on both sides of the vertical picture in the middle.

Solution: Turn the smartphone 90 degrees to either the left or the right. Ta-da! Now you have a horizontal picture with no black bars on either side.
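The blank “bookends” are easy to quantify. Here is a minimal sketch of the pillarbox arithmetic, assuming a 16:9 window and a 9:16 phone clip (both assumptions for illustration):

```python
def pillarbox_bar_width(display_w: int, display_h: int,
                        video_w: int, video_h: int) -> int:
    """Width in pixels of EACH black bar when a video is scaled to the
    display's height and centered horizontally."""
    scale = display_h / video_h      # fit the video to the display height
    scaled_video_w = video_w * scale
    return int((display_w - scaled_video_w) / 2)

# A vertical 1080x1920 phone clip inside a 1920x1080 window:
print(pillarbox_bar_width(1920, 1080, 1080, 1920))  # 656 px of black per side
# Rotate the phone 90 degrees and the clip becomes 1920x1080 -- no bars:
print(pillarbox_bar_width(1920, 1080, 1920, 1080))  # 0
```

Roughly two thirds of the screen ends up as black bars, which is exactly the “column” effect described above.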

 

6. “Are you experiencing an earthquake?” Often when using a smartphone to Zoom, the user holds it in his/her hands or walks around with it, causing the picture to be in constant motion. This is very disconcerting for the other participants in the meeting and interferes with concentration and focus.

Solution: Before the meeting, find a place to sit comfortably. Then find something to prop the phone up against, such as a couple of books, or take a look at the many DIY smartphone stand ideas online. If none of these works for you, invest in one of the many smartphone stands on the market.

 

7. “There’s an echo, echo, echo. . .” Why do some participants sound like they’re in an echo chamber? The location you choose for your Zoom meeting is not only about how you look; it’s also about how you sound. A room that has things on the walls or behind you will allow your voice to sound more round and warm. An empty room will make you sound like you’re in an echo chamber.

Solution: Choose your Zoom location carefully for both how you look and how you sound. Choose a room that has carpeting and/or substantial items in it to avoid the echo effect. If necessary, add items to the room you use, and be sure to place them in “off-camera” areas.
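Acoustics bears this out. Sabine’s classic reverberation formula predicts a much longer echo in a bare room than in a furnished one. A minimal sketch follows; the room size and absorption coefficients are illustrative assumptions, not measurements.

```python
def rt60_sabine(volume_m3: float, surfaces: list[tuple[float, float]]) -> float:
    """Sabine reverberation time: RT60 = 0.161 * V / sum(S_i * a_i),
    where each surface is (area in m^2, absorption coefficient)."""
    total_absorption = sum(area * alpha for area, alpha in surfaces)
    return 0.161 * volume_m3 / total_absorption

# A 4 m x 5 m x 2.5 m home office: 50 m^3, ~103 m^2 of surfaces.
bare_room = [(103.0, 0.03)]                  # hard walls, ceiling, floor
furnished = [(70.0, 0.03),                   # remaining hard surfaces
             (20.0, 0.30),                   # carpet
             (13.0, 0.50)]                   # bookshelves, curtains, sofa
print(round(rt60_sabine(50.0, bare_room), 2))  # 2.61 s -- echo chamber
print(round(rt60_sabine(50.0, furnished), 2))  # 0.55 s -- round and warm
```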

 

8. “Can you hear me now?” We’ve all experienced someone’s audio level getting louder and softer as they speak. Participating in a Zoom meeting doesn’t have the same aural effect as sitting around a conference table with colleagues in the same room. If you keep leaning back and forth or from side to side, you change the distance between your mouth and the microphone at the top of your laptop screen.

Solution: Keep yourself pretty much in one position near the microphone when you’re talking during a Zoom call.

 

9. “Do you have to touch your face all the time?” Participating in a Zoom call reverses the usual speaker-audience environment. Not only are you looking at the audience looking at you, you are also looking at yourself. Presuming everyone keeps their “video” option on, you’re also looking at how people scratch their noses, fidget, close their eyes, yawn, and do dozens of other non-verbal things people do when not talking (or even when talking). It’s a sociology exercise in observing how people behave non-verbally.

Solution: Be actively aware of your own non-verbal behavior and of how others are reacting to what you say.

 

The reality is this: a Zoom meeting is not merely “the next best thing to being there.” A Zoom meeting is not just a formal (or in some cases informal) meeting via electronic means. Inherently, a Zoom meeting is a video event, and as such it should be approached with a sense of “standards” if it is to be fruitful. Remember, Zoom meetings can be recorded by the meeting host, even a family meeting.

Sure, Zoom has enabled millions of people all over the world to communicate with each other wherever there is Internet access. Keep in mind, however, perceptions to the contrary, that only about 54% of the people on the planet have Internet access. So, in a very meaningful way, Zoom allows a little more than half of the world’s populace to stay in touch in the age of COVID.

This aside, “Zooming” will continue as an effective communications medium for the foreseeable future. Moreover, the concept is not new. Teleconferencing and telecommuting have been around for at least 40 years, but now that laptops with built-in cameras and microphones are as ubiquitous as cellphones and smartphones, this new environment creates a demand for proper use. Practice Zoomiquette whenever the opportunity arises and you’ll have a better communications experience.

Eugene Marlow, Ph.D., MBA © 2020



How Old is Multimedia?

The Marlowsphere (Blog #149)

It is a truism that laws more often than not lag behind cultural customs, especially in times of change, to which we could add times of rapid technological change. Sociologist William Fielding Ogburn (1886-1959) posited in 1922, for example, the difference between “material culture” and “adjustment culture.” The former refers to technology, the latter to the often-lagging response to technological change on the part of members of a culture and its cultural institutions. In other words, technology is the primary engine of progress, i.e., change, and it takes time for people and institutions to catch up to the changes and characteristics a new technology brings, especially when it comes to terms and definitions.

In academia, adjustments to technological change with respect to programs, courses, and especially terms are more often than not “behind the curve,” never in front of it. Often there is a tendency to grab onto a new technology well after it has been embraced by early adopters and to describe “new” courses with terms gleaned from the popular media without much forethought.

I have observed this from direct experience.

In 1988 the Journalism Program at Baruch College (City University of New York) invited me to create and teach courses in video field production and radio news. I was the first professor in the program with a print and electronic media background based on my recently acquired Ph.D. and experience in video and radio production.

In reality, I was hired because the Speech Department (now the Department of Communication Studies) had initiated a course in “Corporate Video” and the Director of the Journalism Program (then a part of the Department of English) didn’t want to be outdone! In other words, inter-departmental competition motivated my being hired. Mind you, this was 1988, a year before (now Sir) Tim Berners-Lee introduced the World Wide Web, which, in turn, began the slow but inexorable demise of print journalism. So, in some small measure, the then director of the journalism program (a full professor with a Ph.D. in English Literature, now retired) can be forgiven for not having a crystal ball to peek into the future.

It was not until 2007 (19 years after my hiring) that the department hired a second professor with expertise in electronic journalism. Her specialty was “multimedia journalism.” And it was not until 2016 and 2017 that a third and fourth professor with print and electronic journalism credentials were hired. The latest addition to the faculty has deep experience in podcasting. That’s four professors out of 11 full-time faculty in 29 years, even though in this same period the world of journalism had moved inexorably to a greater reliance on visualization (video) and orality (podcasting) via the computer.

There was progress, however. The (now) Department of Journalism introduced a course in “Advanced Multimedia Journalism” following the establishment of a course in “Multimedia Journalism,” which I also taught. There are now two courses in podcasting.

A couple of years ago we were in the throes of a self-review in response to periodic accreditation requirements. One of the department’s “learning goals” (originally formulated in 2013) dealt with “multimedia.” My immediate reaction on reading this learning goal was that it seemed out of date and mis-defined. It had been articulated in 2013 by a professor with no “electronic journalism” experience to speak of. This prompted me to look into the technical definition and history of the term “multimedia.” My search taught me again that all things have antecedents and confirmed that academia is usually behind the curve.

I discovered that the “concept” and “term” multimedia is more than half a century old! Yes, it’s that old, and it predates the advent of the personal computer. It’s also another example of what’s old is new again.

According to several sources, the term multimedia was coined by singer and artist Bob Goldstein (later ‘Bobb Goldsteinn’) to promote the July 1966 opening of his “LightWorks at L’Oursin” show at Southampton, Long Island. Goldstein was perhaps aware of an American artist named Dick Higgins, who had in 1964 discussed a new approach to art-making he called “intermedia.”

A month later, on August 10, 1966, Richard Albarino of Variety borrowed the terminology, reporting: “Brainchild of songscribe-comic Bob (‘Washington Square’) Goldstein, the ‘Lightworks’ is the latest multi-media music-cum-visuals to debut as discothèque fare.”

But wait! There’s more. Two years later, in 1968, the term “multimedia” was re-appropriated to describe the work of a political consultant, David Sawyer, the husband of Iris Sawyer—one of Goldstein’s producers at L’Oursin.

The original meaning of “multimedia” kept evolving. In my 1995 book Winners! Producing Effective Electronic Media (Wadsworth Publishing Company), co-authored with Research Associate Janice Sileo, in a chapter entitled “Multimedia,” we wrote, “The Microsoft Corporation, in a February 1993 Backgrounder, defined computer-based ‘multimedia’ as ‘the integration of text, graphics, audio, video and other types of information. . . .’” Further, “Clearly, multimedia has evolved from an integration of various digital, electronic, aural, and visual technologies into an interactive medium for use in the home and the office.” Sound familiar? 1993 is 27 years “after” the term was originally coined. Yet some journalism educators use the term and define “multimedia journalism” as if it were invented just a few years ago!

Clearly, the term “multimedia” has been bandied about by journalists and professors of journalism who have no concept of its origin or layered meanings. Further, the term “multimedia journalism” is likewise mis-construed. It should be “computer-based journalism” or “digital journalism.” If used even more correctly, “multimedia” would also refer to film, broadcast, and cable television. After all, these communication media combine sound with pictures and graphics and text of all kinds. This is an example of a more recent generation of professionals ignoring the fact that there are always antecedents.

But to ask these folks to appreciate the abovementioned distinctions might be too much. They perceive they’re in the technological vanguard and don’t want to be disturbed in their academic bubble. They haven’t done their homework. They’re in the caboose of a technological train—with a longer history than realized—whose engine is ahead of them.

©Eugene Marlow, Ph.D. 2020



The Global Village At War

The Marlowsphere Blog (#106)

My last blog posited that we are moving in the direction of humanizing technology. That is, increasingly our communications systems in particular—the Internet, the World Wide Web, telecommunications—are taking on very human-like sensory characteristics: for example, increasing use of voice and increasing use of motion video on the Internet. In homes, businesses, and public spaces, video screens have become flat (some curved) and large.

I also posited that our communications systems have created a global marketplace for information and communications. We have inexorably become what Dr. Marshall McLuhan described in 1964 as a “global village.” His use of those two words is correct. The planet has become global with respect to all manner of human activities. In one sense we have moved from villages, to city-states, to countries, and in the last century to regional economic entities, e.g., the Pacific Rim, the European Union, OPEC, the Organization of American States, the BRICS, and so on.

But the phrase also reflects the characteristics of a village of the past: people in the village are an entity unto themselves, sometimes dealing with outsiders, sometimes not, where men are dominant and women are secondary. In today’s world the concept of a “village” is reflected in the fact that most if not all of the new countries admitted to the United Nations are based on “ethnic centers,” never mind geographic considerations. Further, current conflicts in the world are ethnically based—that is, there are groups at war with each other based on perceived cultural differences. It is not especially about territory, or even economic gain. It’s an extremist view—“we perceive who we are, and if you’re not one of us, you must be killed.”

It is ironic that for all the technological advances in communications and the collaboration they foster, there seem to be more and more groups coming into existence that are insular, exclusive, and extremist.

Examples abound. The current Israel-Palestinian conflict in Gaza is about Hamas wanting to destroy Israel; it refuses to recognize Israel as a legitimate state. On the other hand, Israel exacerbates the perceptions by continuing to build settlements on land the Palestinians regard as theirs. They also perceive that the entire region is theirs, but the United Nations voted otherwise in 1947. There has been conflict ever since. The conflict is exacerbated by Iran, which is apparently supplying rockets to Hamas. Iran seems also bent on finding a way to destroy Israel.

In Iraq, ISIS believes its Sharia interpretation of Islam is the only interpretation of the Koran, and in that context feels no constraint in executing people who do not believe as its members do and blowing things up—such as the resting place of the biblical Jonah.

In Syria, newly elected (for the third time) President Assad thinks nothing of killing his own people who disagree with his self-centered, elitist policies. In Egypt, the former, now deposed President Morsi thought nothing of inculcating his Muslim Brotherhood’s narrow-focused perception of the world. An ex-military general is now the new president. We’ll see how that works out.

In Africa, Boko Haram, led by a man who was once interned in a mental institution, believes that everything in the West is bad. It thinks nothing of killing and kidnapping in the name of Islam.

In Ukraine recently, Russian President Putin thought nothing of seizing the Crimean Peninsula and annexing it to Russia. He also supports the militants in eastern Ukraine who, like the abovementioned extremists, think nothing of seizing buildings and killing those who do not feel the way they do about Russia.

In Chechnya, rebels continue to blow things up—including people—because they want a separate state based on their cultural heritage. The war in Bosnia at the end of the last century was also ethnically based. Many died because each side saw its cultural heritage as the culture to follow. Ethnic cleansing followed.

Adolf Hitler used his Nazi propaganda machine to manipulate much of the German populace into perceiving that the Jews were at the root of their problems and that Jews were less than human. Starting with Kristallnacht (The Night of Broken Glass) in November 1938, the Third Reich succeeded in exterminating six million Jews.

In China towards the end of Mao Zedong’s rule, the Cultural Revolution purged a generation of intellectuals and artists who were perceived as out of step with Mao’s dictates. Millions perished.

In the 16th century the Spanish Armada attacked England. Why? Because Spain was Catholic and Elizabeth I wasn’t. The Armada famously perished in the English Channel from a storm that came up just at the right moment.

Go back a millennium and you have the Crusades, which pitted the righteousness of Christians against the righteousness of Muslim-based cultures. Go back another thousand years and you have the righteousness of the Roman Empire against the emergent righteous Christians, and so on.

Conclusion: toleration of other people’s cultural values and views is not a prominent human characteristic. To bring us back to the present, former Secretary of State Madeleine Albright recently said: “In a sentence, the world is a mess.” The problem in the early part of the 21st century is that the world means the whole world. Two thousand years ago the world was much smaller, but larger than the world of the early agricultural communities of 10,000-12,000 years ago, and certainly much larger than the villages of pre-literate tribes before that.

The problem is that conflicting cultural and ethnic values and views are now supported and accelerated by contemporary transportation, information, and communications technologies. Information (whether factual or not) moves at the speed of light, and this only expands its impact.

So, why haven’t contemporary transportation, information, and communication technologies made the world a better place for all of us to enjoy? Why haven’t these technologies made us different? Why are human beings bent on pursuing conflict rather than peaceful co-existence?

Perhaps part of the answer is that human evolution takes a lot longer than technological development. After all, it took a couple of million years for homo sapiens to develop the capacity for language, but only a few thousand years to move from an agricultural world, to a world with accounting, then writing, then printing, then electronics, and now photonic and nano technologies.

We’ll probably have men and women living on planet Mars sooner than men and women will learn that cooperation and collaboration are more fruitful than armed conflict.

Please write to me at meiienterprises@aol.com if you have any comments on this or any other of my blogs.

Eugene Marlow, Ph.D.
September 1, 2014

© Eugene Marlow 2014



A Technologically Human Future

The Internet and the World Wide Web are confirming examples that homo sapiens, as a tool-making species, is slowly but inexorably replicating itself—that is, externalizing its senses—to the point, perhaps some two hundred years down the road, where the highly sophisticated Star Trek: The Next Generation character Data may actually become a reality.

Moreover, with every step man takes technologically, he further extends his senses—into the microcosmic, across the planet, into space. In effect, over the eons we have taken quantum leaps from the non-literate old man sitting around the campfire telling the stories of the tribe face-to-face with its members, to the current manifestation of technology where the tribe is global (Marshall McLuhan’s so-called “global village”) and the community of man is global.

Yet we are going in two directions simultaneously. In the face of global communications, time and time again the marketing, advertising, and organizational communications subject experts I have interviewed talk about one-on-one marketing, one-on-one customer contact, a one-on-one relationship with any individual anywhere on the planet with access to a computer, a modem, and an Internet hookup. This expression of the Internet experience reflects the steps in technology we have taken that bring us that much closer to the full externalization of human characteristics and behavior. Everything about the Internet and the World Wide Web is taking us further in the direction of a replication of the human, face-to-face experience.

Our current electronic/photonic technologies began with the development of the first speed-of-light technology—the telegraph—in 1838. Since then we have been moving towards the externalization of our senses on a global scale. The Internet and the World Wide Web are a confluence of previous electronic technologies: the telephone, television, telecommunications, video, audio, radio, text, computer graphics, film, computers, and micro-processors. But the Internet and the World Wide Web are not de-humanizing media. On the contrary, they enhance humanity’s ability to communicate. We are in the process of creating new communities around the world, on a macrocosmic scale and, in regional and local communities, on a microcosmic scale.

Internet developments support the view that this technology is pushing the “humanization” of media. In other words, the Internet will increasingly act like a person. For example, the Internet will take on more of a “voice,” such as “You’ve got mail.” A study several years ago by Killen & Associates, Internet Voice: Opportunities and Threats, forecast that global voice/Internet services revenues would top $63 billion by the year 2002, up from $741 million in 1997. Further, approximately 48% of the 2002 revenues would be generated in North America while 33% would come from Europe. The bad news for carriers, according to the study, is that the revenues generated by voice/Internet traffic would mostly supplant old telephone service revenues. Today, increasing numbers of households have given up the old landline technology in favor of the mobile phone.

From the user’s perspective, the Net will become more alive, more interactive, more conversational, more personal, more individualized, more human-like.

Many technological antecedents to the humanization continuum are already in place: robotics, miniaturization, mechanical hearts, man-made materials to replace bones and skin, limb prosthetics, and faster, smaller, smarter computers.

Take a look at your computer. If you have stereo speakers, imagine the speakers as ears, the computer monitor a face, the keyboard a mouth, the CPU a brain (although in this case it is separate from the face). Imagine, then, that this crude head is mounted on top of a highly sophisticated robotic device with the ability to move.

Technological mobility is an eventuality. Two central factors pervade the adoption of a new technology: standardization and mobility. Examples abound. When the Greeks standardized the alphabet, it also became de facto mobile. The same is true of printing, photography, radio, television, video, and telecommunications. All these technologies are mobile. And now—with laptops and smartphones of all types—so are computers. A laptop is just several steps away from evolving into a mobile, robotic brain.

In 1997, researchers at the Department of Energy’s Oak Ridge National Laboratory (ORNL) in Oak Ridge, Tennessee, developed a half-living, half-silicon chip dubbed “Critters on a Chip.” This integrated circuit consists of living sensors—such as bioluminescent bacteria—placed on a standard integrated circuit, or chip. Officials at ORNL boasted that the chip is small, inexpensive, and provides information quickly: “This new development using an integrated chip‑based approach with living organisms could dramatically advance the ability to sense a variety of chemical agents in the environment, such as chemical warfare agents or other toxic substances and things like environmental estrogens that could have detrimental effects on living systems. . . . If it is indeed possible to manufacture these part-electronic and part-biological systems as a small, inexpensive chip, it would dramatically improve the ability to monitor many different types of environments.” (From Michael McPherson, Editor & Publisher, SCCM e‑zine, Social & Charitable Cause Marketing, sccm@netrax.net, PRFORUM LISTSERV, April 18, 1997.)

It is possible. Materials scientist John Rogers and his team at the University of Illinois at Urbana-Champaign have developed materials for use inside the human body characterized by “transience.” This is technology that is born to die, such as sensors that track blood pressure in the aorta after heart surgery, then dissolve once a patient is out of the woods. They have also made eyeball-shaped cameras that mimic human and insect sight, and soft threads of tiny LEDs that can be injected right into the brain. (See Smithsonian Magazine, December 2013.)

In parallel to these technological developments, the Internet and other electronic media are actually fostering more human, face-to-face contact. These media do not separate people; they bring them together. If this were not so, why then has travel around the world expanded? It is not just growth in business travel; it is also people visiting other people on a global scale. I concur with futurist John Naisbitt’s “high tech, high touch” postulate—that as we increase our level of high technology, there is a similar response in the human realm; that as we use more high technology to communicate, as human beings we develop an increased need to “press the flesh,” even if that flesh is at a distance.

In Global Paradox (Avon Books, New York, 1994), Naisbitt made the case that travel is one of the world’s biggest industries—bigger than energy, manufacturing, electronics, or agriculture. As of a couple of years ago, tourism and travel employed more than 98 million people.

As we continue to expand our ability to reach individuals and mass audiences at great distances, we are also becoming more consumed with local and regional issues, sometimes to the detriment of national and global concerns. Simultaneously, there has been an explosion of human interaction on a global scale, as the trends in travel indicate. There are also more arts and sports events. And observe the greater interest in special events, conferences, and conventions. While some social critics might observe that gathering at a sports event is not socializing, it does indicate man’s drive to commune in some form.

This gathering of tribes on a global scale has caused other ramifications as well as clashes. First, English has become the de facto language of business in the world. In the last several decades alone, approximately half of the world’s languages have simply vanished. Meanwhile, English becomes more and more the global language of commerce, not just because the United States remains one of the world’s economic superpowers, but because, as the most hybrid of all major languages, English has the greatest capacity to absorb and fuse the useful remnants of dying tongues. Thus, if you’re going into any kind of business, your strategic use of the English language—both written and oral—needs to be very strong.

"Beyond Culture" Edward T. HallSecond, very generally speaking, according to Edward T. Hall, author of Beyond Culture (New York: Anchor Publishing, 1976) there are two kinds of cultures in the world with variations in between: low context and high context. High context cultures are more sensitive to the surrounding circumstances or context of an event. This is apparent in communication in which non-verbal cues play a significant role in the interaction. Although no culture exists exclusively at either end of the context scale, some cultures, such as the Asian, Hispanic, and African-American are high-context; others, such as Northern European and American, are low-context. When east meets west, the communication can be a challenge, whether on foreign or domestic soil.

For professionals in all fields the evolving media environment requires both media studies and human studies. It is qualitative and quantitative. It requires strong written and oral communications skills. It requires technological skills and people skills. It requires a broad view of the world, but with an eye on the details in one’s own backyard.

Please write to me at meiienterprises@aol.com if you have any comments on this or any other of my blogs.

Eugene Marlow, Ph.D.
August 11, 2014

© Eugene Marlow 2014



The Rich, the Poor, and the Internet

The Marlowsphere Blog (#97)

There’s been a lot of press recently about the disparity between the rich and the poor on the planet. To wit, the headline reads, “The richest 85 people in the world have as much wealth as the bottom 3.5 billion on the planet.”  

The gap between rich and poor is not a new story. History relates there has always been a disparity between those at the top of the wealth pyramid and those at the bottom, especially since the development and evolution of writing. In tribal times, when orality was the dominant form of human communication, there was the appearance of a more equal distribution of perceived assets, except, of course, when it came to the tribal chief, whose status demanded greater access to so-called “wealth,” perhaps in the form of more wives or livestock than everyone else.

Regardless, wealth disparity on this planet has been an economic fact of life for thousands of years. But now there seems to be a change. In a recent Q&A on AOL.com, the authors posit:

Q. Hasn’t there always been a wide gulf between the richest people and the poorest?

A. Yes. What’s new is the widening gap between the wealthiest and everyone else. Three decades ago, Americans’ income tended to grow at roughly similar rates, no matter how much you made. But since roughly 1980, income has grown most for the top earners. For the poorest 20 percent of families, it’s dropped. Incomes for the highest-earning 1 percent of Americans soared 31 percent from 2009 through 2012, after adjusting for inflation, according to data compiled by Emmanuel Saez, an economist at University of California, Berkeley. For the rest of us, it inched up an average of 0.4 percent. In 17 of 22 developed countries, income disparity widened in the past two decades, according to the Organization for Economic Cooperation and Development.

Enter the Internet

When we talk about the Internet we must be careful to define our terms. The Internet that grew out of the Department of Defense’s ARPANET in the late 1960s is quite a different Internet from the one that followed Sir Timothy Berners-Lee’s invention of the World Wide Web in 1989. The World Wide Web allowed for the integration of text, graphics, motion film and video, and sound. Many, including myself, envisioned that the Internet of the 1990s would provide a level playing field between the large mercantile organizations and startups and small businesses.

In a very real way, the technological evolution of the Internet and the ability to access software programs and hardware networks (for a modest fee) allowed non-established and non-traditional entrepreneurs to get into the money-making game without the need for large sums of start-up capital. The result is 634 million web sites as of December 2012 according to Yahoo.com. Now anyone on the planet with access to the right software, hardware, and telecommunications network, can set up a web site and be in business.

Yet what has happened since the advent of the Internet as we understand it, post-1989? Media guru Dr. Marshall McLuhan, who wrote Understanding Media: The Extensions of Man (McGraw-Hill, 1964), would not have been surprised to learn that several professions, especially those requiring creative talent and expertise, have been devastated by the advent of the Internet, and as a result it has become extremely difficult for those in these professions to make a living, putting them behind the proverbial financial eight-ball. These professions are: musicians, journalists, and photographers.

When a new technology comes along—such as the Internet—that provides characteristics that heretofore did not exist, or, if they did exist, weren’t very effective or efficient, that new technology tends to push the older technology out of the way, or, at best, the older technology finds a new purpose in life. Two examples: When the telegraph was installed in the western states of the United States, the Pony Express vanished almost overnight. When the Internet came along, while it did not happen overnight, the classified advertising pages in virtually every newspaper shrank to almost nothing. It is just easier and cheaper to advertise online than it is to advertise in print.

The Internet has devastated the music, journalism, and photography professions. The ability to file share, inaugurated by Napster, made the principle of intellectual copyright almost obsolete overnight. For musicians especially, the monetary value of one’s music has been greatly diminished. CD sales and radio play used to garner significant income—no longer. Physical CD sales have given way to streaming and downloading. What was once thousands of dollars has devolved into pennies. With the Internet, journalists-cum-bloggers can now create their own publishing space, resulting, in part, in too many would-be journalists saying too many things about everything, diluting the value of experienced, professional reporting. Photographers are in the same boat as musicians. Their visual work, if online, is now prey to anyone who can download it. The same is true of written work of any kind if it is online. The potential for plagiarism is rampant. The value of the output of musicians, journalists, and photographers has been diminished in one technological stroke.

Yes, the Internet has leveled the playing field. Perhaps it is more accurate to say the Internet has leveled the players on the playing field. Further, the Internet has provided yet another opportunity for the savvy to accumulate wealth, contributing, perhaps greatly, to the growing income disparity referenced earlier.

Siren Servers
 
Jaron Lanier, in his 2013 book Who Owns the Future? (Simon & Schuster), makes the point that the Internet, rather than providing opportunity for all, has actually created a technological context of “winner take all.” Sound familiar? It is a parallel observation to the fact that in the last thirty years or so fewer and fewer people have accumulated more and more of the planet’s wealth.

Here are a few quotes from his book:

At the height of its power, the photography company Kodak employed more than 140,000 people and was worth $28 billion. They even invented the digital camera. But today Kodak is bankrupt, and the new face of digital photography is Instagram. When Instagram was sold to Facebook for a billion dollars in 2012, it employed only thirteen people. (p. 2)

Ordinary people will be unvalued by the new [information] economy, while those closest to the top computers will become hypervaluable. (p. 15)

There’s an old cliché that goes, “If you want to make money in gambling, own a casino.” The new version is “If you want to make money on a network, own the most meta server.” If you own the fastest computers with the most access to everyone’s information, you can just search for money and it will appear. (p. 31)

These quotes are just the tip of the iceberg in a volume full of valid observations. Lanier should know. He is a computer scientist and musician, best known for his work in virtual reality research. Time Magazine named him as one of the “Time 100” in 2010. Lanier and his friends co-created start-ups that are now part of Oracle, Adobe, and Google.

One of his most telling observations is about “siren servers” (or the “head end,” to use a cable term) and content. And it relates to very recent articles discussing the value of news and information “aggregators”—i.e., web sites that draw content from other sources and “aggregate” them in one place. And this is the point: some of the most successful web sites on the planet do not create one whit of content; they merely provide an effective and efficient platform for others to provide content that can be shared with other people FOR A FEE, either from the users or from advertisers, or both!

It should not take much thinking to identify these web sites: Facebook, PayPal, Google, Yahoo, eBay, Twitter, LinkedIn, as well as the myriad crowdfunding and fundraising sites, et al. None of these web sites creates one whit of content, yet millions have flocked to them with personal and professional information, or products and services for exchange. Regardless of the outcome of the exchange, the web site makes money—lots of it. And those closest to the “siren server” benefit the most.

It cannot be mere coincidence that since the advent of 24/7 cable news in 1980, followed swiftly by interactive cable, then the World Wide Web in 1989 (the same year as Tiananmen Square), and, in turn, the exponential growth of the Internet that is still in motion to this day, there has been a staggering, growing disparity between the rich and poor on this planet. Even last week President Obama pointed out that while those at the very top of the economic pyramid have experienced growth in their collective net worth, the middle class has become economically stagnant, and the lower classes have fallen deeper into poverty.

The Internet has not leveled the playing field. It has diminished the value of people and elevated the value of information. It has stunted the mobility of the middle class and pushed the lower classes deeper into an economic black hole. These are the facts. The Internet is the new kid on the technological block, and it should come as no surprise that it has created major economic shifts that will play out for some time. When the printing press came along in mid-15th-century Europe, enabling the creation of Bibles in the vernacular, it ultimately disrupted the long-held power of the Roman Catholic Church, an effect that is playing out today. There are fewer people attending church services on a regular basis, and fewer men and women are seeking the church as a calling.

The Internet’s force is as powerful as the crushing power of the mile-thick sheets of ice during the last ice age. It is inexorable, and no government regulation or censorship will stop its inevitable effect. It is much too late to attempt to put the Internet genie back in the bottle. From here on out, the mission is to find ways to use this technology for the larger economic benefit of the population.

Please write to me at meiienterprises@aol.com if you have any comments on this or any other of my blogs.

Eugene Marlow, Ph.D.
February 3, 2014

© Eugene Marlow 2014
