Thursday, October 14, 2010

Smart TV, Video Games, Interactivity: A Look Ahead

Five years ago it was quite hard to fathom the idea of walking into a Best Buy and leaving with a 42" plasma television for under $500. In fact, with a little hunting we probably could have scored an even better deal, but nothing like what we might see another five years from now. Right now LED televisions are hitting the mainstream, and even the bigger models (up to 65", I believe) are in stores and will soon dip under the $2,000 mark. How big will they get? How cheap will they become?

It's hard to predict a "hard" ceiling. Projection TVs topped out at around the 62" range, but they had their own set of limitations: the picture was fuzzy thanks to resolution that was, by today's standards, absolutely dismal, and the sets were enormously heavy and bulky. Now that those limitations have been blown to pieces, screens can get as big as manufacturers will allow. Programming in 1080p on a 104-inch diagonal projector screen is crystal clear, and even the largest LED displays don't come anywhere close to the 100-pound mark. The only real hard limitation will be doorway clearance, which allows for just about anything the size of a large sofa. Multi-panel displays, while a perfect solution for computers, probably won't see much living-room action for a number of reasons, unless a way to integrate multiple viewpoints from multiple cameras can be made to work.
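To put that "crystal clear" claim in perspective, here's a quick back-of-the-envelope sketch in Python. It assumes 20/20 vision resolves about one arcminute per pixel (a common rule of thumb, not a hard spec) and estimates the distance beyond which individual 1080p pixels blend together:

```python
import math

def blend_distance(diagonal_in, h_px=1920, v_px=1080):
    """Distance (inches) beyond which a viewer with ~20/20 vision
    (about 1 arcminute of acuity) can no longer resolve pixels."""
    diag_px = math.hypot(h_px, v_px)    # screen diagonal in pixels
    pitch = diagonal_in / diag_px       # inches per pixel
    one_arcmin = math.radians(1 / 60)   # ~0.000291 radians
    return pitch / math.tan(one_arcmin)

for size in (42, 65, 104):
    ft = blend_distance(size) / 12
    print(f'{size}" 1080p screen: pixels vanish beyond ~{ft:.1f} ft')
```

By that math, a 104" 1080p screen looks pixel-free from roughly 13-14 feet away, which is typical home-theater seating distance, so "crystal clear" checks out.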

The latest marketing trend in high-end television is the new 3D technology. While it will be absolutely fascinating once perfected, it's something that still has a very limited application at this point in time. We have a small number of movies and video games that utilize the technology, but the development cost and complexity of this media are still very prohibitive. 3D in all its iterations has been a luxury and not a necessity, and that will be its most limiting factor. Then again, it might take only one really amazing title to tip the scales, but I don't see that happening anytime in the next few years.

So I hear that some new TVs have become smart! They have built-in internet connectivity, so they've morphed into a sort of computer. The newest generation of video game consoles has similar abilities, and this has a number of possible implications. First, some content is now becoming available on the internet via sites such as Netflix, which drops the necessity of having cable/satellite service down a notch. There are folks like me who only subscribe to satellite for sports, and once that content is fully available on the net, I'm probably dropping the dish. Video-on-demand is still a concept foreign to older generations, but it will likely be the rule within the next ten years. Unfortunately, there will be some really big losers in this game (the cable and satellite providers come to mind), and you can bet they'll be holding on for dear life.

Video game development continues to progress, yet we haven't seen a new game console released in almost four years, with relatively little talk about the next generation. Is it possible that processing power has jumped that far ahead of game complexity? Have some games become so realistic that any additional elements require a quantum leap in programming technology? Are the features that drive new game sales less about graphical bells and whistles and more about precision and improved AI?

Wednesday, July 14, 2010

iPads, the niche, and the hype

Well folks, this is it. The device that will revolutionize the way we look at computers. Or phones. Or books. Or all of them! We've seen similar devices in the past, from netbooks to e-readers to super-gigantic smartphones. What we really want to know is: where does it fit into mainstream America in 2010? What I really want to know is: will this be the beginning of a paradigm shift in mobile computing technology as a whole?

I first heard about the iPad at the beginning of the year, and needless to say I was more than a little skeptical. For the past two years or so, tech companies have tried to create a market that bridges the gap between the cumbersome laptop computer (about 3x less cumbersome than 15 years ago, but still) and the convenient-yet-underpowered smartphone. I've seen tablet PCs, ultra-lightweight laptops, and netbooks -- all of which share the same major deficiencies: weight, battery life, messy software (aka Windows), and all sorts of issues we're used to in today's desktop computing world. The Kindle e-reader has seen some popularity, but its capability is quite limited and its display is still too small for my liking. To me, the iPad wasn't going to offer anything I didn't already have, and the price tag was on the high side, to boot.

When the iPad launched in April, I was reading mostly negative reviews from authors who shared my sentiments. Why would somebody go out and throw $500+ at a device that can't run Flash, can't make phone calls, and can't do anything a similarly equipped iPod touch can't? My logic, which at the time seemed completely solid, had a few flaws. Here are a few point-counterpoints about which I have revised my opinion:

-The iPad can't do half the things that a normal laptop running Windows can:
While factually true, the iPad can still do the things that matter most to the vast majority of users, and to virtually all business users. It can handle e-mail, calendaring, and attachments, and it can surf the web (not all web apps work properly, but more on that later). Furthermore, the device comes with no bloat or extra "stuff" to clog up and slow down the system.

-Why would anyone want to read a book on a computer rather than on paper?:
If you still feel this way, try reading something on an iPad or any other e-Reader for that matter. There are still some publications which you'd much rather have to page through and skim (reference books like almanacs come to mind), but I'm a huge, huge fan of reading books on the iPad. You can even change the font and font size to suit your liking. And some eBooks are absolutely free (lots of classic novels) so it really might NOT cost you more in the long run!

-It can't do Flash, so it's automatically bad:
Can we all agree that Flash isn't the second coming? Apple sure thinks so, and they've decided there's a better alternative: individual applications that don't use Flash but are carbon copies of the ones that do. YouTube, Facebook, and just about anything else you can think of. And the ones that haven't done it yet are probably planning on it.

-It has no place in the business/enterprise world:
If Blackberries took the business world by storm, the iPad has the potential to be a Category 5 hurricane. No longer do business travelers need to lug their dreadnought laptop/power supply/accessory bundle everywhere they go. The iPad is light, slim, and clean. The battery will probably last an entire weekend. You'll get your e-mails, and you can actually read the attachments that probably weren't going to be edited until you got back to the office anyway. Oh, and it's a handy-dandy GPS too (if you have the 3G version, which I highly recommend!).

-Price is waaaaaay too high:
But there's really not that much price difference between the iPad and a mid-range laptop. It's heavily marketed toward business professionals, who are much less likely to balk at the price tag, and in relative terms it isn't all that expensive to begin with.

-Who in their right mind is going to carry a laptop, a smartphone, and an intermediate device?:
My opinion is that the iPad doesn't fill the gap I alluded to earlier. It simply complements (and in some cases, can actually replace) a laptop computer. That said, I can see the stock of the laptop falling to some degree, possibly to the point of becoming a niche device.

While I definitely feel as though Apple has made a lot of good decisions regarding the iPad so far, the technology still needs to improve. I'm not here to tell everyone to throw away their laptops and buy iPads, but I do encourage everyone to give it a chance. Apple is making a concerted effort to cater to the business community, and I have faith that its strategy to integrate the iPad into the workplace will be successful.

Sunday, January 3, 2010

Virtual reality and how I see the impact

I'm guessing that the ideas of virtual reality (VR) and artificial intelligence (AI) were dreamed up centuries (or millennia) ago. Ever since, each new depiction of them has grown more feasible and believable. Now, in 2010, is it finally time for virtual reality to take the next big step?

There have been countless movies and cut-scenes portraying the vision of VR in everyday life. Each of them is rendered slightly differently, but the basic premise remains the same: we are presented with an alternate view of the environment on a large screen (or headset), and within this world we are provided with additional sensory stimulation. While we haven't achieved the "ultimate" virtual reality experience yet, we are inching closer and closer to our desired destination.

Back in the mid-1990s, with the advent of multimedia computers, we were finally offered a glimpse of full-motion rendered graphics and audio that were becoming realistic enough to garner interest from mainstream society. Computer graphics capability and rendering have advanced exponentially over the past 15 years, and certain aspects are becoming realistic enough to accept as real. However, there's still a long, LONG way to go until we enter the virtual worlds conceptualized in the movies.

First of all, in comparison to real-world vision, the virtual worlds we have been presented with still have very, very crude graphics. We see unnatural, predictable patterns in terrain. We fail to see any ground clutter, dirt, or other natural deviations from a pristine landscape. Pixellation is still very much a concern and a huge hurdle to overcome. Graphical hardware limitations often prevent a smooth scrolling experience due to suboptimal frame rates. And so far we've only tackled the visual side of things!
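To get a feel for just how far off we are, here's a rough back-of-the-envelope estimate in Python. The figures are ballpark assumptions (roughly one arcminute of visual acuity, i.e. 60 pixels per degree, across a roughly 200 x 135 degree field of view), not hard specs:

```python
# Rough estimate of the display resolution needed to match human vision.
PIXELS_PER_DEGREE = 60            # ~20/20 acuity: 1 arcminute per pixel
H_FOV_DEG, V_FOV_DEG = 200, 135   # approximate human field of view

h_px = H_FOV_DEG * PIXELS_PER_DEGREE   # 12,000 pixels wide
v_px = V_FOV_DEG * PIXELS_PER_DEGREE   #  8,100 pixels tall
needed = h_px * v_px                   # ~97 megapixels
full_hd = 1920 * 1080                  # ~2 megapixels

print(f"To match the eye: {h_px} x {v_px} = {needed / 1e6:.0f} MP")
print(f"1080p delivers:   1920 x 1080 = {full_hd / 1e6:.1f} MP "
      f"({needed / full_hd:.0f}x short)")
```

By that math, today's 1080p displays fall short of the eye by a factor of nearly fifty, and that's before we even consider frame rate. The "long, LONG way to go" isn't hyperbole.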

to be continued...