A Guide to “Zoomiquette”

The Marlowsphere (Blog #152)

In the early days of the World Wide Web (introduced in 1989 by Sir Timothy Berners-Lee), the Internet’s ability to let people all over the planet e-mail each other 24/7 quickly became a defining characteristic of the communications landscape. And as with all new technologies, informal “standards” for using the Internet, à la e-mail, soon hardened into certain “rules” of use, which became known as “Netiquette.”

Today, the new technological “thing” to do is “to Zoom.” The advent of COVID-19 has greatly accelerated the use of the platform (officially released on January 25, 2013). Its stock now trades somewhere in the range of $250 per share. No wonder: the platform reports some 200 million daily meeting participants.

It also has competitors: Amazon Chime, GoToMeeting, Google Hangouts Meet, BlueJeans, Microsoft Teams, Cisco WebEx, and Signal. But like the “Kleenex” brand, “Zooming” has become the generic term for using the Internet for “electronic gatherings.”

Zoom is not the first electronic technology to allow aural and visual communication among people, but its explosive growth has brought the need for certain standards of use. Why? Zooming is really broadcast and cable television in an organizational and personal context, and those media have evolved visual and aural “standards” over many decades. People have certain expectations as to how television should look and sound, and meeting those expectations makes for more effective communication.

So, below I offer a few tips on the emerging “Zoomiquette” so you can look and sound your best and have the most effective communication experience possible.

1. Avoid the “guillotined” look. How many times have you observed a participant on a Zoom call whose head is just above the bottom edge of the screen, looking like another victim of the French Revolution? The guillotined look is not very becoming.

Solution: The video production standard is to divide the screen into three imaginary sections: a top third, a middle third, and a bottom third. A person’s eyes should sit on the line that defines the bottom of the top third of the screen. This way the other participants will see that the head is actually connected to a body, at least a neck and shoulders. You can accomplish this by simply adjusting the angle of the laptop screen.
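If it helps to think in pixels, the rule of thirds reduces to simple arithmetic: the eye line belongs about one third of the way down the frame. Here is a minimal Python sketch of that calculation; the function name and the 720p/1080p examples are my own illustrations, not anything built into Zoom:

```python
def eye_line_y(frame_height_px: int) -> int:
    """Return the pixel row (measured from the top of the frame) where a
    speaker's eyes should sit: the line dividing the top third of the
    picture from the middle third."""
    return round(frame_height_px / 3)

# Common webcam frame heights:
for height in (720, 1080):
    print(f"{height}p frame -> eyes about {eye_line_y(height)} px from the top")
# 720p frame -> eyes about 240 px from the top
# 1080p frame -> eyes about 360 px from the top
```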

 

2. Prevent the “heavenly lighting” look. This is a classic error, even among people in the media production business. The problem occurs when someone on a Zoom call has a bright light behind them instead of in front of them or to the side. It is often seen when someone sets up their laptop with daylight streaming through a window behind them. The cameras installed in laptops respond to the brightest spot in the picture. With this setup the camera looks at the bright light and says, “OK, I’ll adjust everything to that bright light!” The result: the person’s face goes dark. They might look like they have a halo around their head, but no one can see their face.

Solution: Avoid having a window or a bright light behind you. Position the laptop so that the window or bright light is in front of you.

 

3. “I just love the paint on your ceiling.” Here’s another pitfall. People place their laptop on a table that’s below their own line of sight (in a few instances it’s above their line of sight, and everyone gets a shot of someone’s floor). As a result, the laptop camera is looking up at the participant, and we also get a glimpse of the person’s ceiling.

Solution: Position the laptop screen so that it’s level with your eyes.

 

4. “If I knew you were coming, I would have straightened up a bit.” The COVID-19 pandemic has forced a lot of people to hunker down at home. Now Zoom has come into the home. In effect, your home has become a television studio, or at least a video location. Find a location for your Zoom meetings that presents you in a semi-professional setting. I’m sure you’ve noticed that many reporters, experts, and pundits have chosen a place in their home that looks like a library, i.e., there are books in the background.

Solution: Avoid setting up your location against a blank wall. Choose a place where there’s something of interest in the background, often a bookcase or something that reflects who you are. Be careful not to have anything distracting, too personal (like family photos), or offensive in the background.

 

5. “Just say no to vertical video.” A minority of “Zoomers” use their phone to access the Zoom platform, but in so doing they position the phone the way they use it for other purposes: they hold it vertically. The result is a tall column of an image rather than the horizontal picture characteristic of laptop users. By holding the smartphone vertically, the user creates blank “bookends” on both sides of the vertical picture in the middle.

Solution: Turn the smartphone 90 degrees to either the left or the right. Ta-da! Now you have a horizontal picture with no black bars on either side.
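The size of those bookends is easy to quantify: a vertical 9:16 picture scaled to fit inside a horizontal 16:9 frame fills only about 32% of the frame’s width, leaving roughly a third of the frame black on each side. Below is a minimal Python sketch of the arithmetic; the function and the 1080×1920 example are illustrative assumptions, not Zoom internals:

```python
def pillarbox_per_side(video_w: int, video_h: int, frame_w: int, frame_h: int) -> float:
    """Fraction of the frame's width taken up by each black side bar when a
    video is scaled (preserving aspect ratio) to fit the frame's height."""
    scaled_width = frame_h * (video_w / video_h)  # fit video to frame height
    bars_total = max(frame_w - scaled_width, 0.0)
    return (bars_total / 2) / frame_w

# A vertical 1080x1920 (9:16) phone picture inside a 1920x1080 (16:9) frame:
print(f"{pillarbox_per_side(1080, 1920, 1920, 1080):.1%} of the frame is black on each side")
# -> 34.2% of the frame is black on each side
```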

 

6. “Are you experiencing an earthquake?” Often, when using a smartphone to Zoom, the user holds it in his/her hands or walks around with it, causing the picture to be in constant motion. This is very disconcerting for the other viewers in the meeting and makes concentration and focus difficult.

Solution: Before the meeting, find a place to sit comfortably. Then find something to prop the phone up against, such as a couple of books, or try one of the many DIY smartphone stand ideas online. If none of these work for you, invest in one of the many smartphone stands on the market.

 

7. “There’s an echo, echo, echo. . .” Why do some participants sound like they’re in an echo chamber? The location you choose for your Zoom meeting is not only about how you look; it’s also about how you sound. A room that has things on the walls or behind you will let your voice sound rounder and warmer. An empty room will make you sound like you’re in an echo chamber.

Solution: Choose your Zoom location carefully for both how you look and how you sound. Choose a room that has carpeting and/or substantial items in it to avoid the echo effect. If necessary, add items to the room you use, placing them in “off-camera” areas of the room.
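The acoustics behind this advice can be put in rough numbers. Sabine’s classic formula estimates a room’s reverberation time as RT60 ≈ 0.161·V/A, where V is the room’s volume in cubic meters and A its total absorption (each surface’s area times its absorption coefficient). Here is a minimal Python sketch comparing a bare room with the same room after carpet and soft furnishings are added; the dimensions and coefficients are rough illustrative assumptions, not measurements:

```python
def rt60_sabine(volume_m3: float, surfaces: list[tuple[float, float]]) -> float:
    """Estimate reverberation time with Sabine's formula, RT60 = 0.161 * V / A,
    where A is total absorption: the sum of (surface area m^2 * coefficient)."""
    total_absorption = sum(area * coeff for area, coeff in surfaces)
    return 0.161 * volume_m3 / total_absorption

volume = 4 * 5 * 2.5  # a 4 m x 5 m room with a 2.5 m ceiling = 50 m^3

bare      = [(20, 0.05), (65, 0.05)]              # wood floor; bare walls and ceiling
furnished = [(20, 0.30), (65, 0.05), (10, 0.50)]  # carpet; walls/ceiling; shelves, curtains

print(f"bare room:      RT60 ~ {rt60_sabine(volume, bare):.1f} s")
print(f"furnished room: RT60 ~ {rt60_sabine(volume, furnished):.1f} s")
# bare room:      RT60 ~ 1.9 s  (a noticeable echo)
# furnished room: RT60 ~ 0.6 s  (a much drier, warmer sound)
```

As a rule of thumb, the shorter figure is far friendlier to speech on a call.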

 

8. “Can you hear me now?” We’ve all experienced someone’s audio level getting louder and softer as they speak. Participating in a Zoom meeting doesn’t have the same aural effect as sitting around a conference table with colleagues in the same room. If you keep leaning back and forth or from side to side, you keep changing the distance between your mouth and the microphone at the top of your laptop screen.

Solution: Keep yourself pretty much in one position near the microphone when you’re talking during a Zoom call.
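There is simple physics behind this: in open air, the sound pressure of a voice falls off roughly in proportion to distance, so each doubling of the mouth-to-microphone distance costs about 6 dB. A minimal Python sketch of that rule of thumb follows; the half-meter and one-meter distances are illustrative assumptions:

```python
import math

def level_change_db(d_near_m: float, d_far_m: float) -> float:
    """Approximate change in voice level at the microphone when the talker
    moves from d_near_m to d_far_m away, using the inverse-distance (1/r)
    rule of thumb for sound pressure in open air."""
    return 20 * math.log10(d_near_m / d_far_m)

# Leaning back from half a meter to a full meter from the laptop mic:
print(f"{level_change_db(0.5, 1.0):+.1f} dB")  # -6.0 dB: noticeably quieter
```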

 

9. “Do you have to touch your face all the time?” Participating in a Zoom call reverses the usual speaker-audience environment. Not only are you looking at the audience looking at you, but you are also looking at yourself. Presuming everyone keeps their “video” option on, you’re also watching how people scratch their noses, fidget, close their eyes, yawn, and do the dozens of other non-verbal things people do when not talking (or even when talking). It’s a sociology exercise in observing how people behave non-verbally.

Solution: Be actively aware of your non-verbal behavior and of how others are reacting to what you say.

 

The reality is this: a Zoom meeting is not merely “the next best thing to being there.” A Zoom meeting is not just a formal (or, in some cases, informal) meeting via electronic means. Inherently, a Zoom meeting is a video event, and as such it should be approached with a sense of “standards” if it is to be a fruitful meeting. Remember, Zoom meetings can be recorded by the meeting host, even a family meeting.

Sure, Zoom has enabled millions of people all over the world to communicate with each other wherever there is Internet access. Keep in mind, however, perceptions to the contrary notwithstanding, that only about 54% of the people on the planet have Internet access. So, in a very meaningful way, Zoom allows a little more than half of the world’s populace to stay in touch in the age of COVID.

This aside, “Zooming” will continue as an effective communications medium for the foreseeable future. Moreover, the concept is not new. Teleconferencing and telecommuting have been around for at least 40 years, but now that laptops with built-in cameras and microphones are as ubiquitous as cellphones and smartphones, this new environment creates a demand for proper use. Practice Zoomiquette whenever the opportunity arises and you’ll have a better communications experience.

Eugene Marlow, Ph.D., MBA © 2020



How Old is Multimedia?

The Marlowsphere (Blog #149)

It is a truism that laws more often than not lag behind cultural customs, especially in times of change, and, we could add, especially in times of rapid technological change. Sociologist William Fielding Ogburn (1886-1959) posited in 1922, for example, the difference between “material culture” and “adjustment culture.” The former refers to technology; the latter to the often lagging response to technological change on the part of members of a culture and its cultural institutions. In other words, technology is the primary engine of progress, i.e., change, and it takes time for people and institutions to catch up to the changes and characteristics a new technology brings, especially when it comes to terms and definitions.

In academia, adjustments to technological change with respect to programs, courses, and especially terms are more often than not “behind the curve,” never in front of it. Often there is a tendency to grab on to a new technology well after it has been embraced by early adopters and to describe “new” courses with terms gleaned from the popular media without much forethought.

I have observed this from direct experience.

In 1988 the Journalism Program at Baruch College (City University of New York) invited me to create and teach courses in video field production and radio news. I was the first professor in the program with a print and electronic media background based on my recently acquired Ph.D. and experience in video and radio production.

In reality, I was hired because the Speech Department (now the Department of Communication Studies) had initiated a course in “Corporate Video” and the Director of the Journalism Program (then a part of the Department of English) didn’t want to be outdone! In other words, inter-departmental competition motivated my being hired. Mind you, this was 1988, a year before (now Sir) Tim Berners-Lee introduced the World Wide Web, which, in turn, began the slow but inexorable demise of print journalism. So, in some small measure, the then director of the journalism program (a full professor with a Ph.D. in English Literature, now retired) can be forgiven for not having a crystal ball to peek into the future.

It was not until 2007 (19 years after my hiring) that the department hired a second professor with expertise in electronic journalism. Her specialty was “multimedia journalism.” And it was not until 2016 and 2017 that a third and a fourth professor with print and electronic journalism credentials were hired. The latest addition to the faculty has deep experience in podcasting. That’s four professors out of 11 full-time professors in 29 years, even though in this same period the world of journalism had moved inexorably toward a greater reliance on visualization (video) and orality (podcasting) via the computer.

There was progress, however. The (now) Department of Journalism introduced a course in “Advanced Multimedia Journalism” following the establishment of a course in “Multimedia Journalism,” which I also taught. There are now two courses in podcasting.

A couple of years ago we were in the throes of a self-review in response to periodic accreditation requirements. One of the department’s “learning goals” (originally formulated in 2013) dealt with “multimedia.” My reaction to reading this learning goal was to immediately feel how out of date and mis-defined it seemed. It had been articulated in 2013 by a professor with no “electronic journalism” experience to speak of. This prompted me to look into the technical definition and history of the term “multimedia.” My search taught me again that all things have antecedents and confirmed that academia is usually behind the curve.

I discovered the “concept” and “term” multimedia is about 60 years old! Yes, it’s that old and it predates the advent of the personal computer. It’s also another example of what’s old is new again.

According to several sources, the term multimedia was coined by singer and artist Bob Goldstein (later ‘Bobb Goldsteinn’) to promote the July 1966 opening of his “LightWorks at L’Oursin” show at Southampton, Long Island. Goldstein was perhaps aware of an American artist named Dick Higgins, who had in 1964 discussed a new approach to art-making he called “intermedia.”

A month later, on August 10, 1966, Richard Albarino of Variety borrowed the terminology, reporting: “Brainchild of songscribe-comic Bob (‘Washington Square’) Goldstein, the ‘Lightworks’ is the latest multi-media music-cum-visuals to debut as discothèque fare.”

But wait! There’s more. Two years later, in 1968, the term “multimedia” was re-appropriated to describe the work of a political consultant, David Sawyer, the husband of Iris Sawyer, one of Goldstein’s producers at L’Oursin.

The original meaning of “multimedia” kept evolving. In my 1995 book Winners! Producing Effective Electronic Media (Wadsworth Publishing Company), co-authored with Research Associate Janice Sileo, in a chapter entitled “Multimedia,” we wrote, “The Microsoft Corporation, in a February 1993 Backgrounder, defined computer-based ‘multimedia’ as ‘the integration of text, graphics, audio, video and other types of information. . . .’” Further, “Clearly, multimedia has evolved from an integration of various digital, electronic, aural, and visual technologies into an interactive medium for use in the home and the office.” Sound familiar? 1993 is 27 years after the term was originally coined. Yet some journalism educators use the term and define “multimedia journalism” as if it were invented just a few years ago!

Clearly, the term “multimedia” has been bandied about by journalists and professors of journalism who have no concept of its origin or layered meanings. Further, the term “multimedia journalism” is likewise misconstrued. It should be “computer-based journalism” or “digital journalism.” Used even more correctly, “multimedia” would also refer to film and to broadcast and cable television. After all, these communication media combine sound with pictures and graphics and text of all kinds. This is an example of a more recent generation of professionals ignoring the fact that there are always antecedents.

But to ask these folks to appreciate the abovementioned distinctions might be too much. They perceive they’re in the technological vanguard and don’t want to be disturbed in their academic bubble. They haven’t done their homework. They’re in the caboose of a technological train—with a longer history than realized—whose engine is ahead of them.

©Eugene Marlow, Ph.D. 2020



Retreat from the Future

The Marlowsphere Blog (#133)

There we were in the second half of the 20th century. We had experienced the defeat of Nazi Germany and Imperialist Japan that ended WWII, and watched in the late 1970s a pivot towards the West by Communist China following the 1976 demise of dictator Mao Tse-tung (Zedong). In 1989, even as we watched the horror of Tiananmen Square, we also watched the fall of the Berlin Wall and, soon after, the disintegration of the Soviet Union. We saw the creation of the European Union, the death of South African Apartheid, the shrinkage of nuclear weapons on a worldwide scale, the expansion of democracies, the diminution of illiteracy to about 15% of the world’s population, and the increase of global trade, so-called globalism.

We also saw the further exploration of space and the rise of a few non-governmental organizations investing in that exploration. The Higgs boson, the so-called “god particle,” was confirmed, and the “repaired” Hubble Telescope peered closer and closer into the origins of this universe. The future of generations to come appeared to be bright.

But here we are in 2016, and the reverse appears to be true. In 2001, on 9/11, Al Qaeda terrorists hijacked commercial airplanes and flew two of them into the Twin Towers in New York City (this wasn’t their first attempt). There are now terrorist groups in Africa, e.g., Boko Haram, and in Russia, the Philippines, France, and Belgium, among others. It has taken over eight years for the United States and other contingent countries to recover from the “mortgage crisis of 2008.” Britain has just voted to leave the European Union, the so-called “Brexit,” and in the United States the upcoming national election pits a politician, Hillary Clinton, with decades of regional, national, and international experience, against an entertainer, real estate magnate Donald Trump, who has decades of experience on reality television. His vision of the future is to retreat from it by building walls between the United States and Mexico, undoing our trade agreements with other countries, and (possibly) using nuclear weapons against our enemies (whomever they might be).

Elsewhere in the world, terrorist attacks have governments and peoples nervous about open borders and about immigration, with jobs lost in one place only to turn up for less pay in another. Local and regional wars have made millions of people homeless. The disparity between the so-called 1% (the haves) and the rest of the world (the have-nots) grows wider with every year. The rich are getting richer, the poor are getting poorer, and the middle class is getting screwed.

Further, the Internet is not creating a level playing field. It is allowing those with the technical know-how and marketing imagination to create platforms wherein users create content for free while the owners of the servers housing those platforms become very rich.

It is a gross irony that while some countries, like the United States, parts of the European Union, Japan, and China, are exploring space both within and beyond our solar system, several tribal cultures, such as ISIS and the Taliban, are more concerned with what women should wear in public, women’s subservient role in their society, and a strict adherence to the word of Allah. Also, in the Middle East, extremist Jews and extremist Palestinians are at war over who should own what territory. In Saudi Arabia, despite its ostensible acceptance of Western trade, there is an adherence to an anachronistic extreme form of Islam, so-called Wahhabism. The so-called Kingdom funds this form of Islam with millions of dollars and proselytizes its point of view wherever possible.

And in the United States, the chasm between predominantly white police forces and young, mostly unarmed African-Americans has exploded into the headlines and into television and radio news broadcasts in the last several years. The Civil Rights Acts were passed decades ago, but the blatant racism expressed in the shooting of unarmed black males by white police officers appears to have become a common occurrence, even with an African-American president in the White House.

Q: Is America, is the world, retreating from what seemed to be a brighter future a generation ago?
A: Yes, it is. Or that’s the way it seems.

My view is that the world is experiencing a period of retreat from the future into a period of tribalism. And it is not recent. It has been building for some time, perhaps ever since the commercial introduction of the telegraph in 1844, the world’s first electronic communications medium that could transmit information from one point to another at the speed of light. In the mid-20th century, photonic technologies were introduced (technologies based on photons as opposed to electrons). These combined technologies have bumped up the speed of communication and the transport of goods and services, and with them cultural values, on a global basis. Cultures around the world where the literacy rate is much lower than in more developed nations are repulsed by this invasion of outside cultural values. It is anathema to their entrenched cultural values. And, in turn, we are repulsed by their reactions, such as when we hear about honor killings in remote parts of India, and the mutilation of female genitalia in parts of Africa.

Even in so-called developed nations there is a retreat into tribalism. The Brexit vote is one example; the rise of neo-fascism in Germany and the increasing rejection of Islamic-leaning peoples in France are others. It is a retreat born of deep fear: a fear that one’s family and community values are being tested, challenged, upended, and revealed as untrue or unfounded.

People don’t want change, even when it is beneficial in the long run to the greater whole. The speed-of-light technologies that now increasingly circle the planet have thrown opposing cultural values into the same economic pot and have created such fear among the members of opposing tribes that it is engendering violence.

And this phase of planetary cultural evolution will not go away quickly. It will be with us for a while, perhaps a generation or two. Until peoples of different cultural stripes begin to accept that the future is about the integration of cultural values, even the loss and rejection of some values, such as religious and political beliefs, there will be a retreat from the future. Accepting that change is the constant, that change is the way of the universe, a universe we are just beginning to learn about, is a deeply painful process.

This view parallels the structure of scientific revolutions. First, there is rejection of facts that contravene the prevailing view, then there is anger and battles over what is true and what is not true, then ultimate acceptance of the new factual context. We are looking at a generation or two of battles over what is true and what is not true. If world history is any arbiter, progress will prevail, but only after many more have died for their antiquated beliefs and many more have died defending the values of the future.

Eugene Marlow, Ph.D.
September 5, 2016

© Eugene Marlow 2016



“Innovating and Integrating Information Technology: Some Cautionary Notes”

Innovation and integration may be the organizational trend of the day, but just as Rome wasn’t built in a day, neither is innovation, and integrating innovation into organizations is even more of a challenge.

Innovation and integration are challenging activities. If they were easy to accomplish, there would be a lot more of both; they would be commonplace, and we would be living in a completely different world. But the reality is that innovation, whether technological, social, economic, political, legal, scientific, or medical, does not happen on cue, so to speak, and more often than not it happens by accident.

The eventual invention of the telephone, the discovery of plastics, and the creation of Viagra are but three examples of a long list of accidental innovations, which includes microwave ovens, the Slinky, Play-Doh, Super Glue, Teflon, the pacemaker, Velcro, X-rays, chocolate chip cookies, potato chips, Post-It Notes, Corn Flakes, and penicillin. The discovery of galaxies outside our own Milky Way by astronomer Edwin Hubble in the late 1920s, the scientific realization that changed our world view, is also an example of an accidental discovery, made some three centuries after Galileo first turned a telescope to the skies.

On the other side of the “innovation” coin is the innovation that is ignored. A classic example is the invention of the digital camera at Kodak. It was summarily dismissed by company executives because they perceived camera “film” to be their core product. Too late did they recognize their blunder: in January 2012 Kodak filed for Chapter 11 bankruptcy protection.

Innovation is defined as “the action or process of innovating.” Synonyms for innovation include change, alteration, revolution, upheaval, transformation, metamorphosis, and breakthrough.

Integration is defined as an act or instance of combining into an integral whole, or an act or instance of integrating an organization, place of business, school, etc.

It can be said with a high degree of certainty that the technological innovations we take for granted are manifold and have, in turn, influenced the shape and customs of cultures. According to Ryan Allis, a technology entrepreneur and investor who has been part of iContact, Connect, and Hive (source: http://startupguide.com/about-the-authors/), these include:

Technological Innovations 
1. The controlled use of fire (400,000 BCE)
2. Phonetic language (100,000 BCE)
3. Trade and specialization (17,000 BCE)
4. Farming (15,000 BCE)
5. The Ship (4,000 BCE)
6. The Wheel (3400 BCE)
7. Money (3000 BCE)
8. Iron (3000 BCE)
9. Written Language (2900 BCE)
10. The Legal System (1780 BCE)
11. The Alphabet (1050 BCE)
12. Steel (650 BCE)
13. Water Power (200 BCE)
14. Paper (105)
15. Movable Type (1040)
16. The Microscope (1592)
17. Electricity (1600)
18. The Telescope (1608)
19. The Engine (1712)
20. The Light Bulb (1800)
21. The Telegraph (1809)
22. The Electromagnet (1825)
23. Petroleum (1859)
24. The Telephone (1860)
25. The Vacuum Tube (1883)
26. Semiconductors (1896)
27. Penicillin (1896)
28. The Radio (1897)
29. The Electron (1897)
30. Quantum Physics (1900)
31. The Airplane (1903)
32. Television (1926)
33. The Transistor (1947)
34. DNA (1953)
35. The Integrated Circuit (1959)
36. The Internet (1969)
37. Microprocessors (1971)
38. The Mobile Phone (1973)
39. The Smartphone (2007)
40. The Quantum Computer (2011)

The movable-type printing press succeeded through Johannes Gutenberg’s work in the 15th century. Moreover, his “invention,” a vast improvement over the Chinese version of 500 years prior, led to a growing list of books, magazines, and newspapers, and to the Renaissance, the Reformation, the Industrial and Scientific Revolutions, and the so-called Information Age. These developments, in turn, increased the level of literacy in the world. Yet integration is not complete: now, in the 21st century, there are still approximately 800 million people in the world who are illiterate, about 16% of the world’s population, mostly women in parts of Africa, the Middle East, and Southeast Asia. In other words, the spread, adoption, and integration of innovation takes time, not only in societies as a whole, but also in organizations, academic and otherwise.

Why? A recent article in the online “CIO Journal” section of the Wall Street Journal (dated June 17, 2015) has the headline “At Innovation Labs, Playing with Technology Is the Easy Part.” In this article the writer points out:

“For many companies with innovation labs, the road from eureka to real product isn’t . . . smooth. Often a lab builds exciting stuff but then is frustrated in bringing it to the broader organization for commercial use. . . . Classic management science dictates that stable, repeatable processes keep companies in business. Innovation, by definition, disturbs equilibrium, threatening what has gone before. . . . ‘You are causing disruptions to a system that has an immune response to repair those disruptions.’”

In other words, how do you get the “. . .[organizational] body not to reject the [new] organ.”

"The Structure of Scientific Revolutions" Thomas S. KhunA seminal work on how the scientific world initially rejects discoveries that challenge the prevailing wisdom is found in the 1962 book The Structure of Scientific Revolutions by Thomas S. Kuhn.

This brings me to the world of academia. Putting innovation aside for the moment, the integration of innovation is both inevitable and resisted. On the inevitable side of the organizational equation, all academic institutions have already integrated a host of technological innovations, for example: touchtone telephones, teleconferencing, computing, the Internet, e-mail, and mobile phones. Without these electronic technologies no academic institution of higher learning, at least, would exist today. One result is the invisible effect: anyone at the bottom of the organization can communicate with anyone at the top of the organization. The role of organizational middle managers as messengers has virtually vanished, a trend that began in the 1980s and has resulted in the flattening of many organizations.

On the other hand, academic institutions are typically organized into groupings, that is, by academic discipline, otherwise known as departments. Yes, these departments are further grouped into larger groupings, such as arts and sciences, or business, or public affairs, but inherently academic organizations are separated into “silos,” to use the prevailing terminology. And what is the level of collaboration and cooperation among these departments? In my experience, very little. Breaking down the imaginary boundaries among departments is a very difficult endeavor.

So, on the one hand, electronic technologies have created a breakdown of hierarchy of sorts and allowed academics to communicate with others on a global scale 24/7. On the other hand, the typical organization of academic disciplines by subject inherently creates internal resistance to change.

If technological innovation is to occur on a broader basis, more colleges and universities need to create what has been initiated at Lehman College in the Bronx, New York.

Lehman College has been selected by the American Council on Education (ACE) to participate in the Change and Innovation Lab (CIL), a program to help colleges and universities implement significant and sustainable initiatives to increase the number of first-generation and nontraditional students who gain a college degree. Lehman is one of nine institutions that will work during the 18-month CIL project to implement concrete steps on their campuses and identify how some of these practices can be applied broadly at colleges and universities across the country. The project is supported by a $400,000 grant from Lumina Foundation.

Second, and this is really the difficult part, colleges and universities on the undergraduate level need to re-think how students take courses and earn a degree. Online learning aside, is the organization of academic subjects by department the best way to go, or is it just the way it has always been done, and why change?

In a way, the organizational status quo is the line of least resistance. In another way, take any subject, from anthropology to zoology, from accounting to English, and you should find that no subject is pure in content. For example, accounting is a lot about “debit left, credit right.” But it is not just about numbers. It is about analytical thinking, and communicating conclusions in an efficient and effective way to someone else or a group of someone elses in written and oral forms. In other words, in this one subject, it is not just about crunching numbers, it is also about conceptual thinking, and communicating. You can throw in “ethics” as well, given the history of the accounting business in the last 20-30 years.

Expectations of short-term results from organized innovation efforts, and from the integration of those innovations into the organizational structure, need to be tempered by the long history of Homo sapiens’ technological evolution. The reality is that innovation, more often than not, happens by accident, and there are always antecedents. And the integration of these innovations always takes time, a lot of time.

Eugene Marlow, Ph.D.
September 7, 2015

© Eugene Marlow 2015



Job Growth/Job Prospects for the Creative Class, Part III

The Marlowsphere Blog (#123)

The last two blogs (#121/#122) took an overall look at job growth/job prospects for those in the so-called “creative class.” This is an appropriate time of year to do so, given that May and early June are when college students attend commencement ceremonies.

In this blog I take a specific look at those wishing to enter the fields of architecture, multimedia, and film.

Job Growth Outlook 2012-2022

  • Architects: +17%
  • Multimedia Artists & Animators: +6%
  • Film and Video Editors and Camera Operators: +3%

If any conclusion can be drawn from the numbers in this group of “creative class” disciplines, it is this: there is an ongoing seismic shift away from the printed word (and what it takes to create works with printed words) and towards the dominance of the visual image, generally speaking.

It is arguable that this shift has been taking place since the commercial introduction of the telegraph in 1844. Historically speaking, the telegraph (sometimes called the Victorian Internet) is the first significant electronic communications medium; it subsequently led to broadcast radio, then broadcast television, cable, and then the Internet. In between these developments we can point to early computers, desktops, laptops, satellites, CDs, DVDs, and now a host of mobile devices. The list of electronic devices is overwhelming.

At least one other conclusion can be readily identified. The explosive growth of electronic devices has, over a period of 170+ years, shifted dominance away from the printed word, and all the professions required to use printed words effectively and efficiently, toward the visual image, and all the professions required to use visual images effectively and efficiently. In this context, it is no historical accident that since the end of WWII “image” has become more dominant than “substance,” or so it seems.

As with the previous two blogs, the following Job Growth/Job Prospects descriptions are taken directly from the pertinent Occupational Outlook Handbook (online) of the U.S. Bureau of Labor Statistics, Employment Projections program.
 
Architects: +17%

Employment of architects is projected to grow 17 percent from 2012 to 2022, faster than the average for all occupations.

Architects will be needed to make plans and designs for the construction and renovation of homes, offices, retail stores, and other structures. As campus buildings age, many school districts and universities are expected to build new facilities or renovate existing ones. Demand is expected for more healthcare facilities as the baby-boomer population ages and as more individuals use healthcare services.

Demand is projected for architects with knowledge of green design, also called sustainable design. Sustainable design emphasizes the efficient use of resources, such as energy and water conservation; waste and pollution reduction; and environmentally friendly design, specifications, and materials. Rising energy costs and increased concern about the environment have led to many new buildings being built with more sustainable designs.

With a growing number of students graduating with architectural degrees, strong competition for internships and jobs in the field is expected. Competition for jobs will be especially strong at the most prestigious architectural firms. Those with up-to-date technical skills and training in sustainable design could have an advantage.

Employment of architects is strongly tied to the activity of the construction industry. Therefore, these workers may experience periods of unemployment when there is a slowdown in requests for new projects or when the overall level of construction falls.

Multimedia Artists & Animators: +6%

Employment of multimedia artists and animators is projected to grow 6 percent from 2012 to 2022, slower than the average for all occupations. Projected growth will be due to increased demand for animation and visual effects in video games, movies, and television. Job growth [in the United States] will be slowed, however, by companies hiring animators and artists who work overseas. Studios often save money on animation by using lower-paid workers outside of the United States.

Consumers will continue to demand more realistic video games, movie and television special effects, and three-dimensional movies. They will also demand newer computer hardware, which adds to the complexity of the games themselves. Video game studios will require additional multimedia artists and animators to meet this increased demand. Some of the additional work may be sent overseas.

In addition, an increased demand for computer graphics for mobile devices, such as smart phones, could lead to more job opportunities. Multimedia artists will be needed to create animation for games and applications for mobile devices.

Despite modest job growth, there will be competition for job openings because many recent graduates are interested in entering the occupation. Opportunities should be best for those who have a wide range of skills or who specialize in a highly specific type of animation or effect.

Film and Video Editors and Camera Operators: +3%
 
Employment of film and video editors and camera operators is projected to grow 3 percent from 2012 to 2022, slower than the average for all occupations.

Job growth is expected to be slow in broadcasting because automatic camera systems reduce the need for camera operators at many TV stations. Because of the public’s continued strong demand for new movies and TV shows, companies are hiring more people as the motion picture industry becomes more productive.

Production companies and video freelancers are working within new content delivery methods, such as mobile and online TV, which has led to more work for operators and editors. These delivery methods are still in their early stages, yet they provide an opportunity for operators and editors to showcase their work.

In broadcasting, the consolidation of roles, such as field reporters who edit their own work, may lead to fewer jobs for editors at TV stations. However, more editors are expected to be needed in the motion picture industry because of an increase in special effects and content.

Job openings are projected to be in entertainment hubs such as New York and Los Angeles because specialized editing jobs are needed there. Still, film and video editors and camera operators will face strong competition for jobs. Those with more experience at a TV station or on a film set should have the best prospects.

The job prospects for writers & authors, editors, and journalists will be covered in the next blog.

If you have any questions or comments about this or any other of my blogs, please write to me at
meiienterprises@aol.com.

Eugene Marlow, Ph.D.
June 1, 2015

© Eugene Marlow 2015
