Is Bring Your Own Device (BYOD) Going Away?

The Bring Your Own Device (BYOD) phenomenon has been with us for a number of years now, but no one really knows for certain how it affects organizations today. If you read surveys, you might get the idea that BYOD is either exploding or fading. The surveys that readers of Security for Web Developers, HTML5 Programming with JavaScript for Dummies, and CSS3 for Dummies are most likely to read say that BYOD is fading. The problem with those surveys is that they’re taken by IT professionals in large organizations that have an official policy of disallowing personal devices. Disallowing BYOD doesn’t mean that users actually follow the policy.

In reading other articles, you might get the idea that BYOD is actually exploding. The problem with these articles is that they’re based on supposition, not fact. There is no data to back up the claim that BYOD is becoming more prevalent in the workplace. Therein lies the problem. The only official surveys talk to IT personnel on the record, not to users off the record, and no one is going to admit to using a disallowed device and potentially risk their job for the sake of accurate survey data.

Human nature being what it is, my feeling is that people probably employ BYOD whenever they feel they can get away with it. After all, why use multiple devices to perform work when a single device does it all? Users don’t care about hardware, software, data, or anything else for that matter. They care about getting their work done, getting off on time, and getting paid—end of story. Consequently, it makes sense that if users feel it’s possible to get by using a single device to do everything, they’ll do so. However, I have absolutely no data to back this feeling up, and you have to accept my claim for what it is: a feeling.

Something that I’ve been emphasizing in my books is this idea of risk. In order to create applications that work well, yet protect organizational assets, it’s important to assess the risk of every policy and every action. Being overly cautious means that applications will work slowly, lack features, and possibly crash a lot. Users don’t like cautious applications and will avoid them if at all possible. Opening the floodgates is a bad idea too. Yes, the application will run quickly and allow a user to do just about anything, but the user won’t thank you when staying extra hours at work to fix problems created by an application that loses data or causes other damage because it doesn’t provide an acceptable level of risk avoidance.

No matter what survey you look at, BYOD is still a presence in the workplace, so you need to write applications that deal with the risks BYOD presents in a reasonable manner. What this means is checking every bit of data you receive, from anywhere, for potential risks, but not unnecessarily hobbling the user with policies that won’t actually mitigate any risk. Let me know your thoughts on the effects of BYOD in your organization and the actual level of BYOD use at
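To make the "check every bit of data" idea concrete, here's a minimal sketch of the kind of defensive validation I have in mind. The field names and limits (`name`, `quantity`, the 100-character cap) are invented for illustration; the point is that data arriving from an unknown device is never trusted as-is.

```javascript
// Minimal "trust nothing" validation sketch for data arriving from any
// device. The fields and rules here are hypothetical examples.
function validateRecord(input) {
  const errors = [];

  // Accept only a plain object; reject arrays, null, and primitives.
  if (typeof input !== "object" || input === null || Array.isArray(input)) {
    return { ok: false, errors: ["payload must be an object"] };
  }

  // Whitelist the expected fields and ignore everything else.
  const name = String(input.name ?? "").trim();
  if (name.length === 0 || name.length > 100) {
    errors.push("name must be 1-100 characters");
  }

  const quantity = Number(input.quantity);
  if (!Number.isInteger(quantity) || quantity < 0 || quantity > 10000) {
    errors.push("quantity must be an integer between 0 and 10000");
  }

  return errors.length === 0
    ? { ok: true, record: { name, quantity } }
    : { ok: false, errors };
}
```

Notice that the function builds a fresh, clean record from whitelisted fields rather than passing the original object along; unexpected properties from a compromised device simply never make it into the application.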


Application Development and BYOD

I read an article a while ago in InfoWorld entitled “The unintended consequences of forced BYOD.” The Bring Your Own Device (BYOD) phenomenon will only gain in strength because more people are using their mobile devices for everything they do and corporations are continually looking for ways to improve the bottom line. The push from both sides ensures that BYOD will become a reality. The article made me think quite hard about the new challenges that developers working in a BYOD environment will face, challenges they haven’t even had to consider in the past.

Of course, developers have always had to consider security. Trying to maintain a secure environment has always been a problem. The only truly secure application is one that has no connectivity to anything, including the user. Obviously, none of the applications out there are truly secure—the developer has always had to settle for something less than the ideal situation. At least devices in the past were firmly under IT control, but not with BYOD. Now the developer has to face the fact that the application will run on just about any device, anywhere, at any time, and in any environment. A user could be working on company secrets with a competitor looking right at the screen. Worse, how will developers meet legal requirements such as the Health Insurance Portability and Accountability Act (HIPAA)? Is the user now considered an independent vendor, or is the company still on the hook for maintaining a secure environment? The legal system has yet to address these sorts of questions, but it will have to do so soon because you can expect that your doctor (and other health professionals) will use a mobile device to enter information as well.

Developers will also have to get used to working with new tools and techniques. Desktop development has meant working with tools designed for a specific platform. A developer would use something like C# to create a desktop application meant for use on any platform that supports the .NET Framework, which mainly meant working with Windows unless the company also decided to support .NET Framework alternatives such as Mono (an open source version of the .NET Framework). Modern applications will very likely need to work on any platform, which means writing server-based applications, browser-based applications, or a combination of the two in order to ensure the maximum number of people possible can interact with the application. The developer will have to get used to the idea that there is no way to test absolutely every platform that will use the application because the next platform hasn’t been delivered yet.

Speed also becomes a problem for developers. When working with a PC or laptop, a developer can rely on the client having a certain level of functionality. Now the application needs to work equally well with a smartphone that may not have enough processing power to do much. In order to ensure the application works acceptably, the developer needs to consider using browser-based programming techniques that will work equally well on every device, no matter what level of power the device possesses.
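One hedged sketch of how an application might adapt to devices of wildly different power: detect what the client can actually do, then choose how much work to do locally. The strategy names and thresholds below are invented; in a real browser you might populate the capability flags from checks such as `navigator.hardwareConcurrency` or a test for WebGL support.

```javascript
// Hypothetical sketch: pick a rendering strategy from detected device
// capabilities instead of assuming desktop-class power. The flags are
// assumed to come from feature-detection code running on the client.
function chooseRenderStrategy(caps) {
  if (caps.webgl && caps.cores >= 4) {
    return "client-rendered"; // powerful device: do the heavy work locally
  }
  if (caps.canvas) {
    return "canvas-lite"; // modest device: simplified local rendering
  }
  return "server-rendered"; // weak device: push the work to the server
}
```

The design choice here is graceful degradation: every device gets a working application, and the more capable the device, the less load the server carries.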

Some in the industry have begun advocating that BYOD should also include Bring Your Own Software (BYOS). This would mean creating an environment in which developers make data available through something like a Web service that could be accessed from any sort of device using any capable piece of software. However, the details of such a setup have yet to be worked out, much less implemented. The interface would have to be nearly automatic with regard to connectivity. A browser-based application could do this, but only if the organization could at least ensure that everyone used a browser that met minimum standards.
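The BYOS idea above can be sketched as a server that exposes data as plain JSON over HTTP, so any capable client (browser, phone app, desktop tool) can consume it. The routes and inventory data here are entirely invented; the request handler is shown as a pure function so the routing logic is easy to follow, with the actual HTTP server layer omitted.

```javascript
// Hypothetical sketch of a BYOS-style data service: data is served as
// device-neutral JSON, leaving the client software entirely up to the user.
const inventory = [
  { id: 1, name: "Widget", quantity: 3 },
  { id: 2, name: "Gadget", quantity: 7 },
];

function handleRequest(method, path) {
  if (method !== "GET") {
    return { status: 405, body: { error: "method not allowed" } };
  }
  if (path === "/items") {
    // Collection route: return everything.
    return { status: 200, body: inventory };
  }
  const match = path.match(/^\/items\/(\d+)$/);
  if (match) {
    // Item route: look up a single record by numeric id.
    const item = inventory.find((i) => i.id === Number(match[1]));
    return item
      ? { status: 200, body: item }
      : { status: 404, body: { error: "not found" } };
  }
  return { status: 404, body: { error: "unknown route" } };
}
```

Because the contract is just "GET a URL, receive JSON," the server never needs to know or care what device or software is on the other end, which is exactly the property BYOS requires.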

My current books, HTML5 Programming with JavaScript for Dummies and CSS3 for Dummies, both address the needs of developers who are looking to move from the desktop into the browser-based world of applications that work anywhere, at any time. Let me know your thoughts about BYOD and BYOS at


Considering the Increasing Need for Security

Many of the readers I work with have noted an increase in the amount of security information I provide in my books. For example, instead of being limited to a specific section of the book, books such as Microsoft ADO.NET Entity Framework Step by Step (the new name for Entity Framework Development Step by Step) and HTML5 Programming with JavaScript for Dummies provide security suggestions and solutions throughout the book. The fact of the matter is that this additional security information is necessary.

There are a number of factors that have changed the development environment and the way you design applications. The most significant of these factors is the whole Bring Your Own Device (BYOD) phenomenon. Users bring devices from home and simply expect them to work. They don’t want to hear that their favorite device, no matter how obscure or unpopular, won’t work with your application. Because these devices aren’t under the IT department’s control, are completely unsecured, and could be loaded with all sorts of nasty software, you have to assume that your application is always under attack.

Years of trying to convince users to adopt safer computing practices have also convinced me that users are completely unconcerned about security, even when a lack of security damages data. All the user knows is that the application is supposed to work whenever called upon to do so. It’s someone else’s responsibility to ensure that application data remains safe and that the application continues to function no matter how poorly the user treats it (whether through ignorance or irresponsible behavior is beside the point). Because of this revelation about human behavior, it has become more important to include additional security discussions in my books. If developers and administrators are going to be held responsible for the user’s actions, at least I can try to arm them with good information.

The decentralized nature of the security information is also a change. Yes, many of my books will still include a specific security chapter. However, after getting a lot of input from readers, it has become apparent that most readers aren’t looking in the security-specific chapter for information. It’s easier and better if much of the security information appears with the programming or administration techniques that the reader is reviewing at any given time. As a consequence, some of my books will contain a great deal of security information but won’t even have a chapter devoted to security issues.

I’m constantly looking for new ways to make your reading experience better. Of course, that means getting as much input as I can from you and also discussing these issues on my blog. If you have any ideas on ways that I can better present security issues to you, let me know at