
Column Archive

june07

A World Full of “Features”…

I have a cell phone that came with a ninety-page instruction manual. It has a built-in camera and lots of sophisticated “features” for storing phone numbers, various shortcuts for dialing, text messaging, and of course taking photos. Of the ninety pages’ worth of instructions, I regularly use maybe three. For me it’s a phone, not a camera or an Internet appliance. I haven’t taken the time to enter other people’s phone numbers or to acquire any special ring tones.

I have a digital camera. It is also chock-full of “features”. When the first single-lens-reflex 35mm film cameras were introduced, the world was an elegantly simple place. All one had to remember was to set the f-stop, set the shutter speed, and focus on the subject of interest. Most of us could remember to do these three things, so we could concentrate on the process of making beautiful images. Then came automatic exposure, automatic focus, built-in flash, and zoom lenses. Some of those capabilities were useful – especially for people who just wanted to point and shoot and didn’t know enough about light levels and shutter speeds to take adequately exposed family photos. But now, my relatively modest digital camera has a one-hundred-fifty-page user’s manual. Six pages are devoted just to listing all the menu options. I have to carry the manual with me because I cannot remember where in the multi-level menus some of the options are buried. And even then I am using maybe one-third of all these wonderful “features”. And here I thought I was a pretty sophisticated photographer!

I have a car that is in its seventh year of life. It too has many “features”. When purchasing this vehicle, I did not request these “features”. They were considered standard equipment that, according to the salesman, had proven to be virtual necessities in the 2000 model year. Many of these “features” depend on various electronic modules. Well, it should be no surprise that the reliability of these modules can sometimes leave one a bit frustrated – as well as stranded somewhere other than at a repair facility. It also seems that the car manufacturers have figured out that replacing these specialized “modules” can be a nice profit maker – perhaps even more lucrative than the sale of the original vehicle. How can a part made from a two-inch aluminum cylinder, with a simple semiconductor sensor mounted on a couple of wires across one end, be worth $350? That, it seems to me, is an excellent question.

Last week, I was asked to do some tests on a new LCD flat-panel television. I looked on the back, where I needed to make a simple video connection, and was greeted with an array of input possibilities. There was a cable connection; VGA, DVI, and HDMI sockets; an optical input connector; a full complement of RGB input plugs; a single video input; and several audio inputs. I understand that versatility is good, but aren’t we getting a bit carried away?

Given the complexity of these existing products, surely we must be reaching the saturation point for adding more features to devices we already mostly don’t know how to use. If you seriously think we are approaching such an asymptote, you must be living on some other planet. Having met the basic functionality requirements of just about every product that human creativity can envision, the trend is now toward ever more complexity – namely, more “features”.
Reliability and operational elegance have taken a backseat in this competition for consumers’ attention. More “features” are the driving force for marketplace success: cell phones with ever more computational and text-messaging capabilities than the ones we have just barely learned to use; cameras that can process and modify photographs even before downloading them to a computer or printer; cars that can park with no driver assistance and have a myriad of ever more complex performance, sensory, and entertainment features; and computers with operating systems so complex and bloated that no one can understand how they operate or how to select the features we truly need and want.

No matter how much attention has been given to making each additional component reasonably reliable, when complexity rises to this level, failures will occur. Repairability will become an even more serious – and expensive – problem. It already is a problem. Component-level repair is not possible with IC chips using hundreds of input/output connections wave-soldered onto boards full of other surface-mounted components. Not only that, most of these modules are custom made for a specific product. Board-level repair then becomes the only viable path, but this depends on the future availability of these custom boards. Once manufacturers decide to discontinue a board, there is unlikely to be any other source of supply.

We are most likely entering a new era in which there will be no functioning products older than ten years. In the year 2030 there will be no operating cars from the first decade of this century, no cameras that can still take pictures, and no old televisions that we can maintain (or restore) and enjoy. We will still have cars, cameras, and electronic devices from the 20th century, but not from the 21st. There will, of course, be some examples sitting in museums, but they will no longer be operational or maintainable. We are entering the era of the “throw-away” society.

Can this really happen? Well, there is a rather nice historical example of a product that did not last as expected. Some years ago, when color negative film was first introduced to the market, most users did not realize that the film did not have acceptable archival capabilities. Images made by amateurs and professionals alike began to fade within a few short years. This problem was gradually solved, but in photography circles this time


july07

Wearable Electronics…

There was a time in the not-too-distant past when the status symbol for a busy executive was to have a “car phone”. This show of executive prowess was evidenced to the rest of us not-so-important managers by receiving calls from such higher-level executives as they traveled from location to location. Then gradually, in the late ’80s, the car phone became common and affordable enough that some of the rest of us could also get one as a show of our presumed importance and our need for location-independent communications. Of course, as time went on, the car phone became more portable and we could actually carry one around with us – as long as we had a briefcase the size of a large shopping bag. And then, as the saying goes, “the rest is history”. But before we move forward into the next exciting decade, let’s jump back to those days of yesteryear when the car phone was in its heyday.

It was during this time, as I was attempting to look ahead at the next decade of display technologies and potential markets, that the following now-obvious insight came to me: “I don’t want to talk to your phone, I want to talk to you.” What I meant by this is that when I make a phone call to your office, it’s not your office that I want to reach, it’s you. Given this reasonable expectation, the direction toward wider use of cell phones becomes rather obvious. This thought process led me to identify a product category that I began to call “wearable electronics”. In the late ’80s and early ’90s, when I would mention this in my presentations, I would have to explain my expectation that wireless communications bandwidth would increase over the coming years and that there would be a convergence of communications and computers – and, furthermore, a convergence of communications and entertainment. The general response to these predictions was, at best, polite non-enthusiasm.

Oh, if only I had been able to show my audiences the new Apple iPhone! Or even a current version of the small cell phones with built-in cameras that we all now carry with us. In the late ’80s, these would have been seen as devices right out of science-fiction movies. The desire and need for us to communicate, acquire information, and be entertained as we move about is something that is finally beginning to be recognized and addressed with an ever-increasing breadth of products. And in my opinion, we have only begun to scratch the surface of what we can expect to see over the coming two decades.

As an ever more connected worldwide society, we are becoming more mobile and less location-dependent. The concept of having an assigned work location and doing all of our business activities from “the office” is beginning to be recognized as not all that important. Of course, certain professions that depend on laboratory or manufacturing facilities cannot as readily become location-independent. But in time even those workers will not need to spend a full workday at a designated facility. As more and more data is gathered directly onto computers that can transmit it to any location, fewer people will be required on-site. The second major trend driving location independence is the growing expectation that we can be reached at any time of the day (or night) and that business can be transacted at all hours and in all locations.
This has a good side – taking care of urgent, and sometimes not-so-urgent, matters – but also a bad side, in that we cannot plan on any quiet time or vacation time. We are unwittingly becoming full-time slaves to our communicators and computers. Our only break from this is to add entertainment features to our portable electronic gadgets. Thus, the avalanche of “wearable electronics” is beginning to thunder down the technology slopes: ever more capable communicators, portable GPS locators with enhanced features covering restaurants and shopping locations, Internet appliances with free access everywhere, and yet other devices that we can only begin to envision at this early stage.

What will be a common need for all these wearable electronic devices is displays that are bright and sunlight-readable, displays that can show all of this information with good resolution, and displays that consume relatively little power. There is nothing currently on the technology horizon to indicate that battery storage capacity will increase by orders of magnitude in the coming decade, so displays will have to become more efficient to satisfy the power needs of all this information flowing into our wearable devices.

We have reached maturity in the development of displays for desktop-computer and television applications. Over the next few years, the displays for these applications will see continued incremental improvement and further cost reductions, but will not change all that much in their basic performance capabilities. However, for “wearable electronics” the opportunities are virtually unbounded. Much remains to be done with readability under all lighting conditions, power consumption, and form factors. Flexibility and conformability are just beginning to be addressed and, to my knowledge, no consumer products with flexible or conformable displays have yet been introduced. There may even be opportunities for miniaturized projection technologies in this wearable-electronics arena. Therefore, we can expect “wearable electronics” to be a major opportunity for the display community over the next decade. The Apple iPhone is perhaps the first irrationally exciting product that will get other manufacturers to recognize just how big and important this market segment can become.

Should you wish to send me your thoughts on this subject, or perhaps others – using your portable communicators, of course – you can reach me by e-mail at silzars@attglobal.net, by phone at 425-898-9117, or through a still location-dependent fax machine at 425-898-1727.
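A back-of-the-envelope sketch of the power argument made above, in Python. Every number in it is an assumed, illustrative value for a 2007-era handheld (battery capacity and the display, radio, and processor draws), not a measurement of any product mentioned in this column:

```python
# All numbers are assumed, illustrative values for a 2007-era handheld,
# not measurements from any product mentioned in the column.
battery_wh = 3.7 * 1.4          # ~5.2 Wh: a 1400 mAh lithium cell at 3.7 V
display_w = 0.35                # assumed average display draw (backlight dominated)
radio_w = 0.25                  # assumed average radio/baseband draw
cpu_w = 0.15                    # assumed average processor draw

total_w = display_w + radio_w + cpu_w
print(f"display share of power budget: {display_w / total_w:.0%}")
print(f"estimated active runtime: {battery_wh / total_w:.1f} hours")
# With these assumptions the display alone takes nearly half the budget, which
# is why display efficiency, more than battery chemistry, gates wearable designs.
```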


aug07

Modern Conveniences…

We are entering an interesting new world of intelligent devices. Cars that park themselves, appliances that do everything except insert the food into our mouths, and computers that give us near-instant access to nearly all the world’s knowledge. Who would have thought, even just a few years ago, that we would be able to capture images from wherever we are, of whatever we are doing, and instantly and effortlessly transmit them to our friends and family? Well, maybe not entirely effortlessly just yet, but that time will also soon be here. The path into the next decade is set for us to be presented with products having ever more intelligence – devices created to do the thinking for us.

But should you perhaps not want all this thinking done for you, or should the device be limited in what it can do so that you would like to provide it with some of your own intelligence, that is often where the “fun” begins. For example, I have a reasonably sophisticated digital camera that has all the standard features: automatic focus, automatic exposure, optical zoom with a close-up option, and flash when the shutter speed falls below 1/30 second. As with every digital camera, there are various menu options for changing the standard settings and for selecting a manual operation mode. Thus, it seems reasonable to conclude that with all of these features and choices, what more could one possibly want? Who would want to go back to the days of having to decide on a shutter speed, select the optimum aperture, and then manually focus on the object of interest? Well, here is a real-life story that made me wish for a return to those prehistoric times when cameras had fewer “features” and virtually all of the “thinking” was left to the user.

My assignment was to photograph electronic products to illustrate their mechanical construction. Since I would need to transmit the images electronically to another site, using a manual film camera was not an option. But then, why should I need to do that anyway? Shouldn’t the newer products have more versatility without giving up any of the conveniences of the older ones? At least in my naiveté, that is what I was thinking. So let’s get to work.

The first problem I encounter is that the camera insists on focusing on some part of the product other than the one I need to have in focus. Well, that should be easy to fix — just switch to a manual-focus mode and use a smaller aperture to increase the depth of field. But how do I do that with this menu-driven electronic marvel? I suppose, if all else fails, read the instruction manual! After a thorough study of the manual, I learn that “manual” focus does not really mean manual focus. To this camera it means that I can select from five pre-determined areas in the viewfinder and then, by way of the menu options, choose the one where the camera will focus. Well, no wonder that up to now I haven’t been able to get the desired objects in focus. I wasn’t pointing the camera where the focus region had been pre-selected for me. As I begin to realize, my camera has the ultimate decision-making authority and will do what it wants to do, not what I would like it to do.

Ha! But there must be another way to outsmart this stubborn little electronic dervish. I know, I’ll go into the manual mode and stop down to a smaller aperture to increase the depth of field.
That way the focal point won’t be as critical and I will have more options for how I compose my shots. Back to the instruction book. I learn that the camera has only two options for f-stops: f/2.8 and f/7.8. To my surprise, there is a small footnote at the bottom of the page stating that depth of field “may not” increase if the smaller f-stop is selected, because this “smaller” aperture is simulated using an electronic filter. What do they mean, “may not”? If the aperture doesn’t change size, it can’t improve the depth of field. In fact, it turns out that there is no real aperture in this camera at all. To simulate the smaller aperture, it simply lowers the gain of the sensor array.

The little electronic dervish wins again. I give in and compose my photos to accommodate what the camera is going to do, not what I really would like it to do. The final insult is that the flash goes off even when I don’t want it to, because I forget to push one of the buttons four times whenever I turn the power back on. The camera, of course, has reverted to its pre-programmed mode.

Oh, for a return to those good old days of products not quite so intelligent – products that left some of the thinking to the user. In fact, after this experience I decided to go on a search for a digital camera that has all the capability of manual as well as automated photography in the true sense of the word. I’m pretty sure I should be able to find this capability in one of the newer digital SLR models, but before I make my next purchase I am going to make really sure that what are designated as manual modes are really “manual”.

Not only that, but just for the record, I am still quite capable of parking my own car and making my own sandwiches. I can also mow my own lawn and sweep out the garage when it needs it. Wow, I must be a real pioneer. I must have grown up walking for miles through the snow just to get to school – well, actually it was only about a mile and
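As an aside, the optics behind that footnote are easy to quantify. A minimal sketch using the standard hyperfocal approximation, with assumed illustrative numbers (an 8 mm lens, a 5 µm circle of confusion, a subject at half a meter; these are not the actual specs of the camera in the story):

```python
def dof_limits(f_mm, N, c_mm, s_mm):
    """Approximate near/far limits of acceptable focus for a subject at s_mm
    (thin-lens approximation, valid when s is much larger than f)."""
    H = f_mm ** 2 / (N * c_mm) + f_mm          # hyperfocal distance
    near = H * s_mm / (H + s_mm)
    far = H * s_mm / (H - s_mm) if s_mm < H else float("inf")
    return near, far

f_mm, c_mm, s_mm = 8.0, 0.005, 500.0           # assumed lens, CoC, subject at 0.5 m
for N in (2.8, 7.8):                           # the camera's two (alleged) stops
    near, far = dof_limits(f_mm, N, c_mm, s_mm)
    print(f"f/{N}: acceptable focus from {near:.0f} mm to {far:.0f} mm "
          f"({far - near:.0f} mm deep)")
# A real f/7.8 stop roughly triples the in-focus zone with these numbers.
# Lowering sensor gain changes none of this: depth of field depends on the
# physical aperture N, which the camera never actually varies.
```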


sept07

Flip Flops…

It was an early evening at a major airport somewhere in the continental USA. After a long day of working with a client, I was tired – too tired to do anything except anticipate getting on the airplane for my flight home. The thought of doing more work-related reading, or even something not quite as demanding, was frankly beyond my capabilities at that moment. Given that condition of mindless weariness, there is not all that much one can do except sit and wait for boarding time to arrive. Thus, I began to observe the people around me. Some were similarly just sitting and waiting, others aimlessly wandering the concourse, and yet others rushing to try to make a delayed flight connection.

As I watched and pondered, something began to make an impression on me. Most of my fellow passengers were dressed about as casually as decency laws will allow. Shorts and T-shirts seemed to be the predominant form of body covering. And for shoes – well, it seems that flip-flops are now the accepted footwear for long-distance travel. The overall impression was that most of my fellow travelers got out of bed that morning, put on whatever minimal clothing items were close by, slipped into their around-the-house flip-flops, and then forgot to change into something more acceptable before going to the airport. It was really quite amazing to see people running to make a tight connection in flip-flops and shorts while dragging their roller-bags behind them. How did this come about? Whatever happened to dressing to look nice when out in public?

The Seattle Sunday paper has a feature each week showing a comparison photograph of a Seattle location today versus fifty or more years earlier. One of the recent photographs was of the Pike Place Market, taken exactly one hundred years ago. The market itself looks remarkably similar. But what has changed dramatically is the appearance of the people. In the hundred-year-old photo, the men are dressed in dark suits and wearing top hats. The women are dressed in long dark dresses. Even the vendors and delivery men are dressed in shirts, ties, and vests. One hundred years ago, apparently, one took pride in one’s appearance when out in public. Who would have guessed that one hundred years later we would all voluntarily walk around looking like impoverished peasants?

At this point, my idle thoughts took a more serious turn. Did anyone twenty-five or fifty years ago predict that our future would look like this? As I remember from the popular world’s fairs of the 1960s, the future was going to be considerably different. People were going to be dressed in svelte jumpsuits of yet-to-be-invented fabrics. Everyone would be of ideal weight, in perfect health, and wearing futuristically fashionable clothing both at home and out in public. Wow! What a disconnect from reality that turned out to be.

What else did we miss? Well, we don’t have airplanes that fly at supersonic speeds. We went to the moon some forty years ago and then, instead of advancing — we forgot how. Fusion power is as much of a dream today as it was in the last century. The predicted twenty-hour work week with nearly unlimited leisure time has instead turned into a 24/7 work week for many of us, while others spend more time in traffic jams on their way to and from a ten-or-more-hour workday. On the other hand, personal computers and communication devices have progressed beyond anyone’s predictions.
And in our own back yard of display technology – well, it has perhaps progressed just about as predicted. Even forty years ago, the idea of a hang-on-the-wall full-color television was in everyone’s imagination. And over the last decade, we in the display community have made that dream come true.

From these examples, should we conclude that there is no way we can reasonably predict how the future will turn out? How could we not have realized that instead of svelte people in form-fitting jumpsuits, we would be living in a world of flip-flops and minimal clothing — worn by people carrying way too many excess pounds? I have to admit that while I think I can explain the excess weight – too many hours in front of the television and/or computer screen, with food readily at hand – I am challenged to explain the recent clothing trends. What societal force has caused this lack of interest in personal appearance? It seems contrary to what we worship in our role models in the entertainment industry. Why don’t we want to be more like them? Some do try – even with extra help from cosmetic surgery. But, on average, that sure does not show up while sitting in a gate area at a busy airport.

What effect, if any, will this have on future product and technology developments? Are we going to prefer to interact more with computational devices than with each other? Do we need to begin to consider a world where people lead a real life and then supplement it with a more desirable computer-generated one? There are some gradual changes taking place that I don’t think we yet fully appreciate. If the current trend continues, what will we all look like in another twenty or thirty years? I must admit that this one puzzles me. How much more casual (i.e., sloppy) can we get? I shudder to think.

Oh well, at least we in the display industry can provide all of these folks with a variety of displays and entertainment devices. That way no one has to look at anyone else. They can just enjoy the beautiful virtual-life images on their display screens. If your crystal ball is telling you a better story, please let me know. I would be interested to have you share your insights.


oct07

Technology Asymptotes…

Recently, I read an article about an exploratory effort in Japan by NHK to develop a new higher-resolution television system. The NHK Super Hi-Vision system is designed to deliver images with 8K x 4K resolution and a 16:9 aspect ratio. As explained in the article, the objective is to have a 100-inch display on which the individual pixels are not visible from a distance of one meter. Wow! Will we really be able to appreciate such spectacular images, given that the current HDTV system is already better than the practical resolution of the film images we have become so accustomed to seeing in movie theaters? Pondering this led me to contemplate the broader question of whether there are limits beyond which we no longer have the need or desire to push for further improvements – or at which the product is already so good for its intended purpose that we will not pay for anything better.

Perhaps we can gain a useful insight or two by looking at what has happened in other technology areas. Film cameras reached their practical limits of resolution many years ago. Some of the lenses from the 1930s and 1940s achieved resolution levels as good as anything available from “modern” optics. Instead of pushing for further refinements in resolution, lens designers found a more receptive market for added features such as variable focal lengths (wide-angle and telephoto “zoom” lenses), larger apertures, and auto-focus. In trying to balance versatility against resolution, it was not uncommon for resolution to actually be sacrificed to some degree. The camera makers learned where the optimum balance was between film resolution, lens versatility, and manufacturing cost. That led to many years of products that continued to improve in many respects, but lens resolution was not one of them. Thus, today, most images taken with professional-quality 35-mm film cameras fall short of the equivalent of HDTV resolution. For those professional photographers who need to produce higher-quality photographs for glossy magazines or art-gallery displays, a niche market developed using larger film formats such as 2¼-inch square or 6×7 cm. A technology asymptote was reached and sustained for many decades.

Let’s now look at a more recent but closely related example. Some years ago I wrote a column predicting that two-megapixel imagers would be sufficient for digital cameras, since they would produce images of comparable quality to 35-mm film. Clearly, I was too conservative in my prediction. To get to the two-megapixel number, I was trying to balance what I estimated to be acceptable picture quality with the capacity of storage devices available at that time. I did not anticipate that the camera makers would get into a “horsepower” race to see who could introduce a camera with the next-higher megapixel number. Fortunately, the cost of storage continued its rapid drop, so the huge image files produced by the 5–10-megapixel imagers are no longer all that difficult to manage. But the pixel-count race is also finally reaching its technology asymptote. We seem to be settling on 10 megapixels as the magic number for “good enough”. That is sensible, since the lenses sold with those cameras are only marginally adequate to fully utilize even this image resolution.

In other areas of electronics, we have seen similar technology asymptotes come to pass – often with frustrating results for product manufacturers and resellers.
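The NHK design target mentioned at the top of this column can be checked with a few lines of arithmetic. The only assumptions in this sketch are the 7680 × 4320 pixel count implied by “8K x 4K” and the standard one-arcminute acuity figure for 20/20 vision:

```python
import math

diag_mm = 100 * 25.4                           # 100-inch diagonal
width_mm = diag_mm * 16 / math.hypot(16, 9)    # width of a 16:9 panel
pitch_mm = width_mm / 7680                     # horizontal size of one pixel

arcmin = math.degrees(math.atan(pitch_mm / 1000.0)) * 60   # angle subtended at 1 m
print(f"pixel pitch {pitch_mm:.3f} mm subtends {arcmin:.2f} arcmin at 1 m")
# Result: ~0.99 arcmin, right at the acuity limit, so individual pixels should
# indeed be invisible from one meter, just as the NHK design target claims.
```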
Consider, for example, the market for audio components. Audio signal reproduction has become so precise and distortion-free that it is no longer possible to hear the difference between many of the system components, such as amplifiers, tuners, and CD players. Speakers, as electromechanical devices, have not yet been able to achieve such perfection, but the end result has been that about the only way to distinguish products is by their styling, audio power output, and price. Since we can no longer hear the difference, and because profit margins on these components are now so small, many resellers have adopted shady practices to increase profits – such as selling connecting cables and speaker wire at exorbitantly inflated prices with the claim that these accessories will bring out the “full audio capabilities” of the system.

In the early eighties, we were introduced to the first personal computers. The IBM PC and the Apple II gave us our first glimpse into a future that would soon be upon us. The IBM PC and its clones had clock speeds in the vicinity of 10 MHz and hard disks that could store about 20 megabytes. Not so many years after that, we were introduced to desktop computers with 100-MHz clock speeds, then 500 MHz, and then the magic 1-GHz number was achieved. There were plenty of predictions for when we would see 10-GHz clock speeds and beyond. Having started my career as a microwave engineer, I knew how difficult life can get as one tries to work with signals in the many-GHz range. Well, sure enough, we made it to about 3 or 4 GHz and then life got really difficult – we hit the wall. The speed race ended and we were forced to switch over to multi-core processors to continue to increase computational capabilities. Now the race is on to see who can introduce the next-highest number of cores. And in fact, we never even achieved true 3- or 4-GHz operation. That speed applies only to the arithmetic unit within the core of the processor; the information swapped in and out of memory typically moves at speeds well under 1 GHz.

So again we have approached and reached a technology asymptote. It seems that with every product and every technology there comes a time of either “good enough” or that further increases are so difficult and/or so expensive to achieve — or are of such


nov07

Technology Momentum…

One of my memorable experiences from the time I spent in Princeton was standing on the platform at the Princeton Junction train station and watching the Amtrak Metroliner trains come blasting by at full speed. The platform at Princeton Junction is quite close to the tracks, so one gets to experience the full effect of a massive locomotive going roughly a hundred miles per hour. The Shinkansen trains in Japan produce this same sensation of unbridled power as they hurtle through stations just a few inches from the platform. Momentum is defined as mass times velocity. A large locomotive has plenty of mass, and at one hundred miles per hour it also has considerable velocity. Our brains seem to have an innate ability to make the momentum calculation. Each time I watched one of these trains hurtle past, I would instinctively imagine how a bug must feel just before encountering the windshield of a car going at freeway speed.

So how should we interpret the title of this month’s column – technology momentum? And why might we find it important to understand the implications of this concept? In applying the concept of momentum to a discussion of technology, the equivalent of “mass” is the size of an existing enterprise or institution, and the equivalent of “velocity” is the rate of change in a given technology. The larger an existing enterprise is and the faster it is moving, the more difficult it becomes to disrupt its success. And if the existing enterprise is supported by a massive infrastructure, then it does not even have to move all that fast in order to have virtually unstoppable technology and business momentum.

Consider, for example, the competition between silicon and GaAs in the semiconductor business. For many years, it was predicted that GaAs would provide stiff competition for silicon-based products in any application requiring high-speed signal processing. However, GaAs never became a mainstream technology even for these applications. The silicon infrastructure was so massive that nearly every product the GaAs technologists brought to market was soon eclipsed by the more powerful silicon technology base. The GaAs technologists could never achieve a high enough technology-change “velocity” to overcome the more massive resources of the silicon infrastructure.

Today we are experiencing the same phenomenon in software development. While we have all evolved to using our computers primarily as communication devices, we see Microsoft bringing yet another fancier version of Windows to market that is of little practical use to most of us. Yet the revenue stream continues to grow for Microsoft, because this huge speeding train is just about unstoppable. It will take a further major revolution in computer usage before this train derails. Certainly such derailments have happened in the past. Consider, for example, Wang and DEC. Each was a successful company until what it had to offer was replaced by a new capability that made it obsolete. We can of course add many other examples, such as Polaroid, the film business at Kodak, and what is currently happening to CRTs. Applying the concept of technology momentum, we can see that what finally caused the massive existing technologies to be replaced were new technologies that offered developments at a faster pace, or capabilities that the existing technologies simply could not address.
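For the record, the physics in the opening analogy is a one-line calculation. A minimal sketch with assumed, illustrative masses (a 200-tonne locomotive, a one-gram bug):

```python
MPH_TO_MS = 0.44704

def momentum(mass_kg, speed_mph):
    """Momentum p = m * v, in kg*m/s."""
    return mass_kg * speed_mph * MPH_TO_MS

loco = momentum(200_000, 100)   # assumed ~200-tonne locomotive at 100 mph
bug = momentum(0.001, 60)       # assumed 1-gram bug at freeway speed
print(f"locomotive: {loco:,.0f} kg*m/s, bug: {bug:.3f} kg*m/s "
      f"(ratio ~{loco / bug:,.0f}:1)")
# A gap of nearly nine orders of magnitude: the point of the column's analogy
# is that a small entrant needs enormous "velocity" to matter against a
# massive, moving incumbent.
```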
An early hint that CRTs would not always be the dominant display technology came when the first laptop computers appeared with monochrome LCD screens — with poor contrast and slow response. Even though they were markedly inferior to CRTs in image quality, they represented the only way to achieve the desired portability.

Now, suppose that you are a clever inventor with a new idea for a superior display device. Let’s say that you have demonstrated in your laboratory that you can bring to market a superior large-screen television. Your preliminary estimates show that products based on your idea will also be cheaper to produce than the current crop of large LCD and plasma panels. What are your chances of success? Unfortunately, not very good. The existing companies making large-screen televisions are multi-billion-dollar enterprises. Supporting them is an equally large infrastructure of production-equipment suppliers. All this represents a massive worldwide business infrastructure. Not only that, but their progress in improving image quality and driving down costs continues to be rapid. So the momentum of these combined organizations is truly awesome. In order to overcome this huge existing technology momentum, your tiny organization of a few engineers would need to have near-speed-of-light velocity. This means that what you have invented must be truly revolutionary. An improvement of a few factors of two, or even an order of magnitude, may not be enough.

Of course, one possible solution could be to approach one of these large enterprises to see if they would like to license your idea. That may work, if you do it with some care. A less desirable outcome could be that they simply implement your idea and then use their near-infinite resources to make it difficult and very expensive for you to obtain legal relief. This technology momentum can be scary stuff – just as scary as watching that massive train suddenly hurtle past just a few inches from the platform where you are standing. On the other hand, just as it is important to be standing on the platform and not on the tracks when this event takes place, it is important to understand the dynamics of these technology-based businesses.

In the worldwide display community, we are currently in a period of massive manufacturing build-up of both LCD and plasma technologies. We are also beginning to see the first hints of serious efforts to bring OLED technology to market. Since this new OLED effort is originating from some of the larger companies, they have what it takes to create the momentum to make this a potential success. They have the “mass” in terms of resources, and OLED technology is demonstrating good “velocity”


dec07

Speak to Me, Greta…

“Keep right, take the next exit, then turn right.”
“Continue for 200 feet, then turn left onto Main Street.”
“RECALCULATING!”
“Turn right and continue for 100 feet, then turn right onto Main Street.”
“RECALCULATING!”
“Turn left, continue for 100 feet, then turn onto Main Street.”
“Approaching destination.”
“Thank you, Greta. My apologies that I was in the wrong lane and couldn’t get over in time to make the turn as you instructed.”

As you have undoubtedly already concluded, Greta is not a real person but the portable GPS navigation unit that we take with us on our travels by car. So why do we call this electronic appliance Greta? Well, it’s because she seems to be developing a personality. Or at least I am ready to endow her with one. The especially endearing part is when I don’t follow her instructions, as in the example above. When she says “recalculating”, there is just that slight tinge of sarcasm in her voice. Any day now I expect her to say something much more explicit, such as “How many times do I have to tell you?” or “If you can’t do this easy stuff, what will we do when we get into city traffic?” But, so far, her only truly sarcastic comment has been, “A better route is available!” And I really do appreciate her not ending this observation with something like “you dummy”.

As we approach this Christmas season, when purchasing decisions for all kinds of toys and gadgets are weighing heavily on our minds, it’s worthwhile to look ahead at what’s in store for us as we approach the end of the first decade of the 21st century. I am predicting that the next decade will be the decade of electronic machines and toys that begin to have selectable personalities, and that we will begin to interact with them at about the same level of affection as we show some of our colleagues and certain casual friends. Real intimacy may be another decade or two away, but for now casual relationships will at least get us started. Children are already beginning to grow up with dolls and other simulated personas that allow for learning experiences not only on a factual level but also on an emotional one. And soon we adults will begin to get comfortable relating to our electronic gadgets in a way that a few years ago would have seemed borderline insanity. No more. Recently, I read a newspaper article reporting that people were beginning to treat their robotic vacuum cleaners as pets. They were even dressing them up with special decorative “clothing” and taking extra care to make it more convenient for these robotic vacuums to get around their homes. And yes, they too were giving them names. Another interesting observation made in this article was that people became more tolerant of a product failure once they had developed this emotional attachment.

Will it be long, then, before our automobile says to us, “You know, I really feel lousy this morning. My engine exhaust sensor just picked up an extra-high level of carbon monoxide emissions, and you know how I hate that!” “I don’t know about you, but I’m not about to leave this garage — unless you promise to drive me straight to the service guy.” “And by the way, I should tell you that the last time he checked me over, he really made me feel good.” The actual technology to do this is pretty much already in place. The engine computers are recording these malfunctions, which can later be accessed by any service shop or anyone else with the proper software. So why wait for the service person to tell us what the computer has already decided?
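That point is easy to demonstrate today. Below is a minimal sketch of reading the trouble codes an engine computer already stores, using the third-party python-OBD library and an OBD-II adapter; this is a modern illustration of the idea, not anything described in the column, and the chatty wording is invented:

```python
import obd  # third-party python-OBD library: pip install obd

connection = obd.OBD()                              # auto-detects a connected OBD-II adapter
response = connection.query(obd.commands.GET_DTC)   # stored diagnostic trouble codes

if response.is_null() or not response.value:
    print("I feel just fine this morning. Nothing to report.")
else:
    # response.value is a list of (code, description) pairs,
    # e.g. ("P0420", "Catalyst System Efficiency Below Threshold")
    for code, description in response.value:
        print(f"I'm not feeling so well: {code} - {description}. "
              "Please drive me straight to the service guy.")
```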
Why not just have the car tell us “in person”? I think we all would actually appreciate that. And for now, if the conversation is a bit one-sided, because the car can’t relate to our crudely expressed opinion of why the sensor should be picking up this information and how we are supposed to get to our 9:00 am meeting on time, then maybe that’s just as well.

I’m actually somewhat surprised that this transition to talking appliances has not started to happen sooner. We did have a few early attempts with cars that incorporated voice commands, but they were more irritating than helpful. The information being verbalized was too basic and too redundant with what we could easily see on the instrument panel. But since then, the world has changed — thanks mostly to cell phones. Just a few years ago, when people were seen in public loudly talking to themselves, this was considered seriously nutty. Today it’s the sign of a busy executive taking care of business – even while standing in a public restroom. If we can carry on a conversation while doing that, what could possibly be wrong with having a vacuum cleaner that tells us how it’s feeling today? “My sinuses are really plugged up from all that dust I picked up yesterday – I sure could use a new filter.” “Thanks, you’re sure a good master.”

So, as we once again enter the season of frenzied acquisition of toys and gadgets, we should begin to think about the mini-personalities that we will soon be accepting into our lives. This year we’re at the threshold of infancy for these gadgets, but their birthrate will accelerate and they will become more like us with each passing year. It’s actually going to be great fun. A device with a built-in personality can, if nothing else, be more predictable and more patient with us than the “real” personalities we sometimes have to deal with in our daily lives. Will all this have an effect on displays and display technology? Absolutely. Along with these mini-personalities will come software-created personas that will populate our displays. They will become like companions to us and be transferable from device to device. The scary part is


jan08

If At First You Don’t Succeed…

The traditional version of the saying goes, “If at first you don’t succeed, try and try again.” The expectation is, of course, that such persistence will pay off in eventual and well-rewarded success. However, some time ago I heard another version of this saying that is perhaps equally valid: “If at first you don’t succeed, give up — no sense making a fool of yourself.” Should we take a statistical approach and do something in between? Or will that just lead us down the path to a life of mediocrity?

What led me to contemplate these two versions of folk wisdom are the recent efforts of major companies to re-introduce products that seem to be in this mode of “try and try again”, but with no clear vision of their eventual success. The two products I am specifically thinking about are the “electronic book” and the “tablet computer”. Both have been around for a number of years but have not yet made any significant inroads in replacing either conventional books or conventional mouse-and-keyboard computers. E-book introductions have, over the years, been tried by a number of companies, while the tablet computer has been mostly a Microsoft dream. Now Amazon.com is making another run at the e-book, and Microsoft is suggesting that the time may finally be right for the tablet computer.

Why is it that some products catch on almost immediately and others languish, eventually just fading away? And if a product is not a success initially, can it become a success at a later time? Flat-panel displays certainly have not needed any great promotional efforts to make them a success. Consumers fell in love with the thinness of these displays even while the images were inferior to those produced by traditional CRTs. It seems that the products able to achieve almost immediate acceptance are the ones that then continue to grow and flourish. Conversely, I can’t seem to come up with any noteworthy examples of products that languished and then finally took off. Sometimes there are price barriers that limit growth, but that is typically only a temporary and obvious obstacle.

For a number of years, I have wondered what fundamental problem the e-book will solve that would make it a highly desirable product. For example, the laptop computer made “anywhere” computing possible. That was immediately important to traveling professionals and soon became important even to students. The only problem I can see the e-book solving is to provide reference libraries to those who need them during their normal business activities, e.g., attorneys. However, the laptop computer can perform this function at least as well and perhaps better. So then we are left with a device that is supposed to replace a conventional book – one downloaded copy at a time. A conventional book is generally not any heavier than an e-book, it’s impervious to all kinds of abuse, and it can be passed on to others for further use or stored away on a shelf as a visible reminder of knowledge gained. The printed text is easy on the eyes, the paper pages are soft and comfortable to hold, and browsing and skipping around are easy to do. Would the e-book be desirable because it’s cheaper? Or could it be beneficial because access to a new download is easier than buying the print version? Somehow it just doesn’t seem to have that key ingredient causing consumers to rush out and buy one. Is it possible we are still missing something here?
Is there something unique that will come along to change the dynamics of this technology? Certainly the displays used in these products are by now quite adequate, and readability is not an issue. Battery life seems to be just fine as well. So what will it take? Or is it one of those products that will attract a small following but never become a mainstream technology? It seems that the more time passes, the more difficult the path to success becomes.

The tablet computer may be in this same precarious spot. Is there a unique problem that it solves? If so, what is it? Could it be a replacement for my laboratory notebook? If so, will I be able to do everything I now do on a paper page, but even better and faster? What useful new capabilities will the tablet computer provide that I do not already have on my laptop or desktop computers? I have seen Microsoft managers holding these tablets during presentations and using pen-like styluses to access their PowerPoint slides. But watching them cradle their tablets in an uncomfortable bent-arm position for an extended time did not make me want to rush out and get one. Will Microsoft come up with something that these gadgets can do that we cannot already do more easily? If not, then this product too may end up in the junkyard of wonderful but useless ideas.

It’s actually quite amazing how willing consumers are to put up with all kinds of imperfections when they find a new product they “simply must have”. The early laptop computers had terrible-looking displays. They were monochrome, with low contrast and limited size. Basically, they were just barely usable. But since this was the only way to get portable computing, we were willing to accept these deficiencies. Today, we see the same amazing user adaptations with text messaging. The keyboards are tiny, and as a consequence we have had to develop incredible thumb dexterity. The multi-letter buttons also lead to abbreviation skills that would make a court reporter proud. Many of us can no longer live without a continual stream of communications coming in and going out. “Have you called your girlfriend today?” “No, but I sent her three text messages.” Definitely a new approach to romance! It seems that consumers are actually amazingly capable of selecting what we find useful and, conversely, what we can just as


feb08

Try and Try Again – Part II, In 3D…

In the January column, we examined two products that seem to be in the “try and try again” category, namely the e-book and the tablet computer. Then recently, I came across an announcement that literally shouts to be added to this category – the formation of a consortium to promote 3D for home entertainment. Can it be? Will we soon be watching TV in our homes in 3D?

About fifty years ago, there was an attempt to introduce 3D movies in theaters. After the initial burst of enthusiastic publicity, the technology did not succeed. The need for polarizing glasses, the resulting eyestrain, and the artificiality of the “doll house” effect from the stereoscopic images made 3D movies a one-time novelty. For movies with a serious story line, the 3D effect turned out to be more of a distraction than a way to be drawn further into an immersive viewing experience. That is not so surprising, given that the addition of two-view stereo creates conflicts within our visual system due to the lack of corresponding depth of focus, eye accommodation, and head-movement parallax.

Is there something different in today’s display technology, or in our viewing habits, that would lead to a success after fifty years of hibernation? The fundamental technology for producing stereo images has not changed all that much in the last fifty years. We still need to use polarizing glasses for a reasonably realistic effect. The auto-stereoscopic viewing systems that claim not to require polarizing glasses are really not very good; they truly are still in the novelty category. And the stereo effects are still limited to a single viewing plane that takes into account neither our eyes’ accommodation for objects at different distances nor the parallax shifts as we change our head position.

If the technology for producing what we wishfully call 3D has not changed, then what could possibly be the reason for thinking that there is a market for 3D home entertainment? The answer may lie in broadening what we usually think of as “watching TV”. What is new, and didn’t exist fifty years ago, are computer games and other computer-generated images. When playing computer games, the fundamental premise is already an artificial environment. So the addition of a 3D effect that is only a rudimentary imitation of reality can be quite acceptable. This could work especially well for games such as those produced by Nintendo that allow the user to interact with a hand-held wand simulating the playing of a tennis match or the swinging of a golf club. However, reading the announcement for the 3D Home Entertainment Consortium, I did not detect any recognition of this new direction. As best I could tell, the stated purpose was primarily to develop standards and methods for the distribution of conventional movie-style entertainment, but in 3D.

Oh well, the good news is that it really does not matter. Consumers will decide what they like and how they will use 3D technology – or not. From a consumer standpoint, it will be perfectly acceptable if the 3D technology is pushed by the conventional movie industry and then Nintendo or someone else quickly adopts it for computer games. The movie industry may not be so enthused about the lack of use for what they hoped would be a new market, but consumers will be happy to have a new display technology at their disposal.
For playing computer games, having to wear polarizing glasses while holding a wand is really quite acceptable. The approximate 3D effect produced by a two-view stereo image may also be acceptable for watching other content that is computer-created and inherently artificial. But for the typical movie, where the story line should be the predominant focus, it seems that we have nothing to offer today that is different from the failures of fifty years ago. For those of you who think that 3D is the next generic viewing experience, I would be interested to hear your thoughts on how and why that will happen. In the meantime, I will be just as happy to see our rudimentary 3D display technologies enter the home-entertainment scene through computer games, flight simulators, and other such “unreal” images.
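The visual-system conflict described above can be made concrete with a little geometry. A minimal sketch assuming a 65 mm interocular distance and a screen three meters away (illustrative values only): the eyes must converge at the simulated depth while the lens stays focused on the screen plane.

```python
import math

IPD_M, SCREEN_M = 0.065, 3.0    # assumed interocular distance and screen distance

def vergence_deg(distance_m):
    """Angle between the two eyes' lines of sight for a point at this distance."""
    return math.degrees(2 * math.atan(IPD_M / (2 * distance_m)))

for apparent_m in (1.0, 3.0, 10.0):
    # On-screen image separation needed to place a point at the apparent depth;
    # negative means crossed disparity (the object floats in front of the screen).
    disparity_mm = IPD_M * (1 - SCREEN_M / apparent_m) * 1000
    print(f"object at {apparent_m:>4} m: eyes converge at "
          f"{vergence_deg(apparent_m):.2f} deg, disparity {disparity_mm:+6.1f} mm, "
          f"focus fixed at {SCREEN_M} m")
# The lens always accommodates to the screen at 3 m while vergence follows the
# simulated depth: the conflict the column describes, absent in natural viewing.
```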


