A Future Including Virtual Reality

Seeing is believing, or at least that’s how it’s supposed to work. In the future, however, seeing may not mean believing at all. While building the PC for Build Your Own PC on a Budget, I investigated various new technologies, including virtual reality, where what you see may not exist at all. Of course, gamers are eagerly anticipating the Oculus Rift, which promises to transform gaming from staring at a monitor into an experience where you really feel as if you’re there. This kind of technology isn’t quite available yet, but it will be soon. Even when the hardware is ready and the drivers work as promised, truly immersive games will take time to create. Look for the experience to evolve over time, perhaps to the point where the Holodeck featured in Star Trek actually becomes a reality.

To attract attention and become viable, however, technology must answer specific needs today. It was with great interest that I read Marines test augmented reality battlefield. Unlike the Oculus Rift, this technology exists right now, and it demonstrates some of the early uses of these displays that you can expect to see. Strictly speaking, it’s augmented reality rather than virtual reality: the background is real (an actual golf course), and the system overlays the hardware of war on the scene, including tanks, mortars, and effects such as smoke. What the marine sees is a realistic battlefield that doesn’t exist anywhere but in the viewer’s glasses. This is the sort of practical use that will continue to drive development until we get a holodeck sometime in the future.

Virtual reality for gamers and the armed services is nice, but it’s also becoming a reality for everyone else. Samsung and Facebook are introducing a virtual reality solution for moviegoers. That’s right, you’ll be able to strap some glasses to your head and get transported to a comfy living room with a big screen TV where you can watch the latest movies offered by Netflix. The Gear VR device promises to change the way people see movies forever. The device works in concert with your smartphone, which provides the display and the processing, so you need a compatible smartphone to use it. In addition to movies, Gear VR also promises to let you play virtual reality games and become involved in other immersive environments. All you really need is the right app.

An immersive experience, where you eventually won’t be able to tell the real from the created, is what virtual reality promises. Using virtual reality, you could travel to other parts of the world, explore the ocean depths, or even saunter through the solar system as if you were really there, all without leaving your own home. Virtual reality will eventually transform all sorts of environments, including the classroom. Imagine children going to school, interacting with other students, and learning from the best instructors without ever leaving home. A student could get a top-notch education for a fraction of the cost that students pay today.

Coupling virtual reality with other technologies, such as robotics, could also allow people to perform a great many unsafe tasks in perfect safety. A human could guide a robot through a virtual reality connection to perform real-world tasks that would be unsafe for a human to perform alone. Think about the use of the technology in fighting fires or responding to terrible events that currently put first responders at risk. Virtual reality will eventually change the way we view the world around us, and I hope the experience is as positive as vendors are promising today. Let me know your thoughts about virtual reality at John@JohnMuellerBooks.com.

 

Considering Threats to Your Hardware

Most of the security write-ups you see online deal with software. It’s true that you’re far more likely to encounter some sort of software-based security threat than any of the hardware threats seen to date. However, ignoring hardware threats can be a problem. Unlike the vast majority of software threats, which you can clean up, hardware threats often damage a system so badly that it becomes unusable. You literally have to buy a new system because repair isn’t feasible (at least, not at a reasonable price).

The threats are becoming more ingenious, too. Consider the USB flash drive threat called USB Killer. In this case, inserting the wrong thumb drive into your system can destroy the system outright. The attack is insidious in that your system continues to work as normal until that final moment, when it’s too late to do anything about the threat: the thumb drive charges itself from the USB port and then discharges high voltage back into the system, frying it. Of course, avoiding the problem means using only thumb drives that you can verify are clean. You can’t even trust a thumb drive provided by a friend, who could have obtained it from a contaminated source. The result of such an attack is lost data, lost time, and lost hardware, potentially making the attack far more expensive than a software attack on your system.

Some of the hardware-based threats are more insidious. For example, the Rowhammer vulnerability makes it possible for someone to escalate their privileges by accessing the DRAM in your system in a specific way: repeatedly reading (hammering) the same rows of memory cells can flip bits in adjacent rows that the attacker shouldn’t be able to touch. The technical details matter less than the fact that it can be done, because even with repairs, memory will continue to be vulnerable in various ways. The problem is that memory cells have become so small and so densely packed that the electrical isolation that used to protect one row from its neighbors no longer works reliably. In addition, hardware vendors often use the least expensive memory available to keep prices low, rather than higher end (and more expensive) memory.
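
To make the idea concrete, here’s a minimal sketch (in C, for illustration only) of the access pattern that gives Rowhammer its name. The buffer size and the offset between the two addresses are hypothetical guesses; a real attack has to find two addresses that land in different rows of the same DRAM bank, and whether any bits actually flip depends on the specific memory module:

    /* Rowhammer access pattern, for illustration only. The buffer
     * size and the 8 MB offset are hypothetical; a real attack must
     * find two addresses mapping to different rows of the same DRAM
     * bank. Compile for x86, which provides the clflush instruction. */
    #include <stdint.h>
    #include <stdlib.h>
    #include <emmintrin.h>  /* _mm_clflush */

    /* Read two addresses over and over, flushing them from the CPU
     * cache each time so every read activates the DRAM rows again. */
    static void hammer(volatile uint8_t *a, volatile uint8_t *b, long n)
    {
        for (long i = 0; i < n; i++) {
            (void)*a;
            (void)*b;
            _mm_clflush((const void *)a);
            _mm_clflush((const void *)b);
        }
    }

    int main(void)
    {
        uint8_t *buf = malloc(256 * 1024 * 1024);
        if (!buf)
            return 1;
        hammer(buf, buf + 8 * 1024 * 1024, 10000000L);
        free(buf);
        return 0;
    }

The cache flushes are the key: without them, the CPU would satisfy the repeated reads from its cache and the DRAM rows would never be activated again.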

It’s almost certain that you’ll see more hardware threats on the horizon because of the way people work with electronics today. All these new revelations remind me of the floppy disk viruses of days past. People would pass viruses back and forth by trading floppies with each other. Some of these viruses would infect the boot sector of the system hard drive, making them nearly impossible to remove. As people increasingly use thumb drives and other removable media to exchange data, you can expect to see a resurgence of this sort of attack.

The potential for hardware-based attacks continues to increase as the computing environment becomes more and more commoditized and people’s use of devices continues to change. It’s the reason I wrote Does Your Hardware Spy On You? and the reason I’m alerting you to the potential for hardware-based attacks in this post. You need to be careful when exchanging seemingly innocent bits of hardware with others. Let me know your thoughts about hardware-based attacks at John@JohnMuellerBooks.com.

 

Thinking About the Continuing Loss of Privacy

It’s easy to wonder whether there will ever come a time when humans no longer have any privacy at all. In part, the problem is one of our own making. We open ourselves up to all sorts of intrusions for the sake of using technology we really don’t need. I’ve discussed this issue in the past with posts such as Exercising Personal Privacy. As people become more addicted to technology, their thinking changes. The technology becomes a sort of narcotic that people feel they can’t do without. Of course, it’s quite possible to do without the technology; it’s the will to do so that’s lacking.

A couple of articles that I read recently highlight the consequences of unbridled technology overuse. The first, Getting Hacked Is in Your Future, describes the trend in hacking modern technology. Of course, avoiding getting hacked is simple: just stop using the technology. For example, people got along just fine without remote car starters to warm up their cars, and the practice wastes a considerable amount of gas anyway. The point of the article is that hackers aren’t ever going to stop. You can count on this group continuing to probe technology, find the holes, and then exploit them to do something horrid.

Wearable technology is also becoming more of a problem. The ComputerWorld article, Data from wearable devices could soon land you in jail, describes how police will eventually use the devices you rely on to monitor yourself against you. The problem isn’t the wearable technology itself, but the fact that many people will use it indiscriminately. Even though logic says that wearing the device just during exercise is fine, people will become addicted to wearing them all the time. It won’t be long before you see people monitoring every bodily function 24 hours a day, seven days a week. The use of cameras to watch static locations on a street will soon seem tame compared to the intrusions these new technologies make possible.

A reader recently asked whether I think technology is bad, based on some of my recent blog posts. Quite the contrary: I see the careful use of technology as a means of freeing people to become more productive. The problem I have is with the misuse and overuse of technology. Technology should be a tool that helps, not hinders, human development of all sorts. I see technology playing a huge role in helping people with special needs become fully productive citizens, to the point where the special need all but disappears (perhaps so completely that even the technology’s user no longer notices it).

What is your take on the direction that technology is taking? Do you see technology use continuing to increase, despite the problems that it can pose? Let me know your thoughts on the good uses for technology and the means you use to decide when technology has gone too far at John@JohnMuellerBooks.com.

 

In Praise of Dual Monitors

A lot of people have claimed that the desktop system is dead, that people are only interested in using tablets and smartphones for computing, and that the desktop may well become a thing of the past. It’s true that my own efforts, such as HTML5 Programming with JavaScript for Dummies and CSS3 for Dummies, have started to focus on mobile development. However, I plan to continue using my desktop system when working because it’s a lot more practical and saves me considerable time. One such time saver is the use of dual monitors.

Yes, I know that some developers use more than just two monitors, but I find that two monitors work just fine. The first monitor is my work monitor—the monitor I use for actually typing code. The second monitor is my view monitor. When I run the application, the output appears on the second monitor so that I can see the result of changes I’ve made. Using two monitors lets me easily correlate the change in code to the changes in application design. Otherwise, I’d be wasting time switching between the application output and my IDE.

I also use two monitors when writing my books. The work monitor contains my word processor, while my view monitor contains the application I’m writing about. This is possibly one time when a third monitor could be helpful: one to hold the word processor, one to hold the IDE, and one to view the application output. However, in this case, a third monitor could actually slow things down, because the time spent viewing the output of a book example is small compared to the time spent creating a production application.

The concept of separating your work from the source of information used to perform it isn’t new; people have used the idea for centuries. For example, when typewriters were used to output printed text, the typist employed a special stand to hold the manuscript being typed. Having one surface on which to view your source material and another on which to actually work has appeared throughout history because it’s a convenient way to perform tasks quickly. By employing dual monitors, I commonly get a 15 to 33 percent increase in output, simply because I can see my work and its associated view at the same time.

Working with dual monitors not only saves time, but can also reduce errors. By typing while viewing the application’s output, I can more reliably transcribe the text of labels and other information the application provides. The same holds true when viewing information sources found in other locations. Seeing the information as I type it makes errors far less likely.

Don’t get the idea that I support using dual monitors in every situation. Many consumer-oriented computer uses are served just fine by a single monitor. For example, there often isn’t a good reason to use two monitors when viewing e-mail, at least not at the consumer level (you could make a case for dual monitors when working with e-mail and a calendar to manage tasks, for example). Dual monitors commonly see use in the business environment because people aren’t necessarily creating their own information source; the information comes from a variety of sources that the user must consult in order to work reliably.

Do you see yourself using dual monitors? If you use such a setup now, how do you employ it? Let me know at John@JohnMuellerBooks.com.

 

Extending the Horizons of Computer Technology

OK, I’ll admit it: at one time I was a hardware guy. I still enjoy working with hardware from time to time, and it’s my love of hardware that helps me see the almost infinite possibilities for extending computer technology to do all sorts of things we can’t even envision right now. The fact that computers are simply devices for performing calculations really, really fast doesn’t actually matter. The sources of data input do matter, however. As computer technology has progressed, the number of sensor sources available to perform data input has soared. It’s the reason I recently wrote an article entitled Tools to Help You Write Apps That Use Sensors.

The sensors you can connect to a computer today can handle just about any detection task imaginable. You can detect everything from solar flares to microscopic animals. Sensors can hear specific sounds (such as breaking glass) and detect ranges of light that humans can’t even see. You can rely on sensors to monitor temperature extremes or the amount of liquid flowing in a pipe. In short, if you need to determine when a particular real-world event has occurred, there is probably a sensor to do the job for you.

Unfortunately, working with sensors can also be difficult. You can’t simply plug a sensor into your computer and expect it to work. The computer needs drivers and other software to interact with the sensor and interpret the data it provides. Given that most developers have better things to do with their time than write arcane driver code, obtaining the right tool for the job is absolutely essential. My article points out some tricks of the trade for making sensors a lot easier to deal with so that you can focus on writing applications that dazzle users, rather than writing drivers they’ll never see.
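
As an example of what the right tool buys you: on a Linux system, the existing driver stack already exposes many sensors as simple files, so application code can read a temperature sensor without touching driver code at all. Here’s a minimal sketch; thermal_zone0 is the common location for the first thermal sensor, but the exact path varies from machine to machine:

    /* Read a temperature sensor that the operating system's existing
     * drivers expose through sysfs. Assumes a Linux system; the path
     * below is typical but varies by machine. */
    #include <stdio.h>

    int main(void)
    {
        const char *path = "/sys/class/thermal/thermal_zone0/temp";
        FILE *f = fopen(path, "r");
        long millideg;

        if (!f) {
            perror(path);
            return 1;
        }
        if (fscanf(f, "%ld", &millideg) == 1) {
            /* The kernel reports the value in millidegrees Celsius. */
            printf("Temperature: %.1f C\n", millideg / 1000.0);
        }
        fclose(f);
        return 0;
    }

The point is that the hard part, the driver, is already done for you; the application just reads a file and interprets a number.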

As computer technology advances, the inputs and outputs that computers can handle will continue to increase. Sensors provide inputs, but the outputs will become quite interesting in the future as well. For example, sensors in your smartphone could detect that you’re having a heart attack and automatically call for help. For that matter, the smartphone might even be programmed to help in some significant way. It’s hard to know precisely how technology will change in the future because it has changed so much in just the last few years.

What sorts of sensors have you seen at work in today’s world? Do you commonly write applications that use uncommon sensor capabilities? Let me know about your use of sensors at John@JohnMuellerBooks.com. I’d really like to know how many people care about these sorts of technologies so that I know whether you’d like to see future blog posts on the topic.

 

The Pain of Current Hardware Updates

It’s no longer possible for the average person to install hardware on a system with any assurance of success, and a few of us old hands are encountering problems as well! That’s my experience with a recent hardware update for my system. Yes, I got the job done, but it required more work than necessary and included several trips to the store. In one case, the store sold me the wrong part (not the part I requested) and I ended up having to go back to exchange it. One of the few significant advantages of owning a desktop system, the ability to update as needed, is being eroded by a serious deficiency in how upgrade components are packaged and documented.

When I first started building my own systems many years ago, the devices that went into the box came with beautifully rendered manuals, all the required software, and any required installation hardware. Of course, you could get cheaper products that didn’t quite include everything, but even then, the device included a getting started book and the required software. Many people opted for the nicer vendor packages to ensure they wouldn’t have to continually run to the store for yet another part. It was overkill in a way. For example, few people actually bothered to read the manuals end to end; they simply used the getting started guide to install the hardware as quickly as possible, and then kept the manual as a quick reference for when problems occurred.

A few years ago I noted that even high end products no longer shipped with a paper manual. You received the getting started guide in paper form and could then use the full manual on the accompanying DVD once you got the system running again. The devices still shipped with all the required hardware and software. Some storage devices had the software installed right on the device itself, but either way, you received the required software. Even so, the new packaging technique achieved a nice balance between protecting the planet and still allowing just about anyone to perform a hardware upgrade.

You might have noted that the Monday post was missing. That’s because I was offline wrestling with a hardware update that should have been quite easy. Replacing my hard drive and display adapter should have taken only a few minutes, but it ended up taking an entire day (starting Sunday afternoon) due to a lack of documentation, incomplete (but required) installation hardware, and missing software. Today my system is running, mostly configured, and the new parts work beautifully, but the price of getting them installed was way too high.

There are a few new lessons that I’ve learned as part of this experience. The most important is to check the box to ensure you have absolutely everything before you get started. Yes, this has always been good advice, but products of the past generally included everything needed to get the job done. Given the trend I’m seeing now, you’ll likely need screws, possibly a piece of installation hardware, cabling, and other items that are listed as optional in the documentation (even though the device won’t work without them). Check the installation hardware before you leave the store to make sure they’re actually selling you the right part. For example, make sure the cable you buy is rated to handle the load you’re placing on it (a cable rated for 3 Gb/s may not work well with a device designed to transfer data at 6 Gb/s).
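
The checking doesn’t have to stop once the parts are installed, either. On a Linux system, for example, the kernel reports the speed each SATA link actually negotiated through sysfs, so you can verify that a new drive and cable really trained at 6 Gb/s. Here’s a minimal sketch; the link name (link1) is an assumption, so list /sys/class/ata_link/ to see what your system actually exposes:

    /* Print the negotiated SATA link speed that the Linux kernel
     * reports through sysfs. The link name below is an assumption;
     * check /sys/class/ata_link/ for the names on your system. */
    #include <stdio.h>

    int main(void)
    {
        const char *path = "/sys/class/ata_link/link1/sata_spd";
        char speed[64];
        FILE *f = fopen(path, "r");

        if (!f) {
            perror(path);
            return 1;
        }
        if (fgets(speed, sizeof speed, f))
            printf("Negotiated link speed: %s", speed); /* e.g. "6.0 Gbps" */
        fclose(f);
        return 0;
    }

If the reported speed is lower than the drive’s rating, suspect the cable or the port before you blame the drive.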

It pays to put any DVD that comes with the device into the drive of your working system and explore it before you take that system down for the upgrade. Print out any information you need for the installation, such as jumper settings and cabling instructions, before you take the system offline; once the system is down, it’s too late to print anything. If you don’t have a second system for viewing the documentation at that point, you’ll find that installation is next to impossible.

Some devices no longer come with an installation DVD at all. In this case, you must go to the vendor site, download the required manual and software, and familiarize yourself with both before you take your system offline. Make sure the software and manual are placed on removable media because you may need them before the installation process is complete.

Make sure you perform the upgrade in a manner that allows you to revert to the pre-upgrade state when necessary. This, too, has always been good advice, but it’s even more important now that the odds of success are lower. You may find that you have to reverse the upgrade to get a working system so that you can determine why the upgrade didn’t work.

Desktop systems have the advantage of allowing updates, but performing the update has become significantly more difficult because vendors no longer take the care in packaging products that they once did. What sorts of problems have you encountered? Let me know at John@JohnMuellerBooks.com.