Our Technology Future - Consumer Tech
So, it’s hard to imagine discussing the future of technology without looking at consumer tech. I can promise you one thing… this post will not promise much you couldn’t figure out yourself.
When it comes to consumer technology, predicting the future is a risky gambit. You either bet on the low-hanging fruit, or you go out on a very flimsy limb. The reason is that it’s difficult to judge what the general public will latch onto or what ideas tech companies will try to push. Partly you could say I just have no imagination.
For example, the Ultrabook phenomenon is incomprehensible to me. What is an Ultrabook? It’s a lighter, thinner, smaller laptop. That’s really all it is. Manufacturers select a different set of CPUs to keep thermals down, usually strip out the optical drive, and keep the screens between 10” and 14”. But that amounts to nothing more than a smaller laptop. So why market it as an “Ultrabook”? Because it gets more money and attention that way. It makes it sound like some kind of amazing new technology. It’s just a smaller laptop. The term “Ultrabook” exists mainly for marketing purposes. Lighter, thinner laptops have been a trend since the introduction of the laptop computer. Renaming them something else now holds no real tangible meaning.
Another example is the apparent clamor for Smartwatches. Samsung and Qualcomm have only in the last couple of days revealed their own plans for the new devices. Quite simply, you wear the device on your wrist, and it operates in conjunction with your Smartphone, relaying notifications or short messages that would otherwise appear on your phone onto the screen of the watch. To me, it’s another useless device. You still have to have the phone in your pocket. If you get a message, your phone rings or vibrates. Why do I also need my watch to tell me that I got a message? Are we really to the point where it’s too much trouble to pull a phone out of our pocket and glance at the screen?
These examples show the unpredictability. They come out of the ether, and their sustainability or staying power is anyone’s guess. But what of PCs? There has been a lot of talk about the “end of PCs” or a “post-PC era” that has begun, in which we will cease to use PCs. Is there any reality there? No.
Admittedly, I am a little biased because I love PC technology. But part of the reason for my determined statement that PCs are not going anywhere is that the talk about PCs going away is bunk. PCs are changing, not disappearing. You say the word “PC” and the first thought is the black tower on top of or next to the desk with the screen, keyboard, and mouse attached to it. But laptops are PCs, tablets are PCs, netbooks, notebooks, even Smartphones are all essentially PCs in a different form. I argue we should stop calling them “Smartphones,” since very few people seem to even care about, much less use, the “phone” capabilities.
As with the laptop versus Ultrabook comparison, the goal for PCs has always been to make them smaller and more mobile while retaining or enhancing power. The laptop itself was a step in that process. However, a basic fact remains: something larger can hold more power. The chip inside a Smartphone is about the smallest functional consumer processor. Put several of them together, working harmoniously, and you have a more powerful system. Follow that rationale far enough and you eventually get a room housing a supercomputer. The question is mostly about what you want to do.
That is the key to the consumer tech future: what will we want to do? We’ve seen in the last few years the attempt to make 3D a part of the average household. That is the response to the public’s desire for immersive entertainment, the feeling of being right there in the action, whether it’s a movie or a sporting event. That same desire has driven the advancement of higher-definition television. But as we’ve seen, just as with the last attempt to bring 3D technology to the masses in the 70s and 80s, the technology sounds much better than its application delivers. 3D TVs are very expensive and produce very questionable results. Many people chose to see movies on regular displays instead of in 3D, and the very limited 3D options for TV have dried up due to very low uptake.
Curved OLED screens and 4K resolution are the newest advancements in displays.
4K is the more tangible of the two in the immediate term. It is a resolution of 3840x2160 (as opposed to 1080p, which is 1920x1080), and it is the biggest jump in displays since 1080p, or since 2560x1440 for PC monitors. The benefit is greater pixel density for better image quality, greater color reproduction, and better contrast. Part of the UHD (Ultra High Definition) specification, it promises, much as the switch to 1080p did, to add more detail to the picture you currently get. 4K televisions have already started rolling out, and even a couple of 4K computer monitors have come out, but they are very expensive (again, like high-definition televisions relative to standard-definition TVs at their introduction).
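To put those resolution numbers in perspective, here’s a quick back-of-the-envelope calculation in Python (a minimal sketch; the resolutions come straight from the figures above, while the 55-inch screen size is just an assumed example):

import math

def pixel_count(width, height):
    # Total number of pixels for a given resolution
    return width * height

def ppi(width, height, diagonal_inches):
    # Pixels per inch for a screen with the given diagonal size
    diagonal_pixels = math.sqrt(width ** 2 + height ** 2)
    return diagonal_pixels / diagonal_inches

full_hd = pixel_count(1920, 1080)   # 2,073,600 pixels
uhd_4k = pixel_count(3840, 2160)    # 8,294,400 pixels

print(f"1080p: {full_hd:,} pixels")
print(f"4K:    {uhd_4k:,} pixels ({uhd_4k // full_hd}x as many)")

# Pixel density on a hypothetical 55-inch screen (size assumed for illustration)
print(f"55-inch 1080p: {ppi(1920, 1080, 55):.0f} PPI")
print(f"55-inch 4K:    {ppi(3840, 2160, 55):.0f} PPI")

In other words, 4K packs exactly four times the pixels of 1080p, which roughly doubles the pixel density on a screen of the same size.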
Curved OLED is about what it sounds like. The idea is that with very large screens, or sometimes even with smaller ones, you want the image to wrap around your field of vision. PC gamers achieve this with multiple monitors. In a home theater setting, the curvature means better viewing throughout the room, not just directly in front of the screen. It’s a nice evolution of display technology. Right now it’s exceedingly expensive, meaning it will take some time before it catches on in a major way.
What will come in the future is anyone’s guess. The truth is no one can be entirely certain. Increasing resolution is probably one of the most directly achievable advances; it’s simply stuffing more pixels into a given spot. The challenge is accomplishing that feat in a reasonable space, at a reasonable cost. PCs will keep getting smaller and more powerful, although I think the general public is running out of things to do with that power. The Smartwatch emerges from the sci-fi dream of a supercomputer on your wrist, projecting information in front of you as a hologram or directly to your eye (à la Google Glass). For that reason I don’t think it too farfetched that Google will make their own Smartwatch designed to work with Google Glass (something like packing more computing power into the watch to relay to the glasses). The continual miniaturization of technology will allow the Smartwatch to go from its current form as an ill-timed, mostly useless gadget to a truly useful addition to the pantheon of consumer technology.
For desktops, the number of cores and the clock frequencies will go higher, while the TDP, the lithography node, and the general size will go lower. Hopefully we will finally get motherboards with integrated wireless adapters. USB will likely take over more and more connectivity duties as it gets more powerful. I don’t think Thunderbolt will see many more gains. And of course GPUs will advance in virtual lock-step with CPUs, powering the advanced display technologies.
We are quickly approaching the point where it will be inconceivable for screens or mobile devices to get any thinner. Holographic technology is a popular dream, but it lacks any reasonable, fact-based method of implementation, so some breakthrough would have to develop there. Our cars will get safer and cleaner. Starting next year we may start seeing self-driving cars (which I personally find dull). The trends we have seen will continue onward. Anything else would be just baseless speculation at this point. But who knows? There could be an amazing new technology resting on someone’s drawing board, waiting to be a real revolution in the consumer technology world.