Gartner has predicted that app errors will be the cause of 75 per cent of mobile security breaches by 2017.
This should worry anybody with a smartphone – not to mention all those businesses trying to establish a viable bring-your-own-app (BYOA) policy.
Apparently, the security threat will not be from a highly technical malware attack but the misconfiguration and misuse of apps. To an IT services provider, this rings true.
Security breaches we come across are seldom caused by a sophisticated external attack. Instead, they are the result of an inside job or human error that has been overlooked.
More often than not, people expose themselves to risk to gain more privileges for the devices they use. This is certainly the case when we try to jailbreak iOS or root Android to increase our access to what's inside our mobiles, and to add apps and modifications that aren't authorised by the creators of the handsets.
This type of phone tampering is problematic for mobile security and could be one of the major causes of future security breaches.
Jailbreaking an iPhone disables the part of the operating system security architecture that minimises system crashes and limits third-party access to the apps and user data.
The same happens when rooting the Android platform. A barrier is removed, and unauthorised users can gain access and cause havoc.
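One way organisations spot this kind of tampering is to check a device's reported files against well-known jailbreak and root artefacts. The sketch below illustrates the idea in Python; the indicator paths are common examples, but the reporting format and function names are illustrative assumptions, not any particular MDM product's API.

```python
# Sketch: flag a device as possibly jailbroken or rooted by checking a
# device-reported file inventory against well-known tampering artefacts.
# Paths shown are commonly cited indicators; the data shape is assumed.

JAILBREAK_INDICATORS = {
    "/Applications/Cydia.app",  # package manager often present after an iOS jailbreak
    "/usr/sbin/sshd",           # SSH daemon frequently installed post-jailbreak
}

ROOT_INDICATORS = {
    "/system/bin/su",           # common locations of the su binary on rooted Android
    "/system/xbin/su",
}

def looks_tampered(reported_files):
    """Return True if any known jailbreak/root artefact appears in the
    file paths a device reported during enrolment."""
    files = set(reported_files)
    return bool(files & (JAILBREAK_INDICATORS | ROOT_INDICATORS))
```

A device reporting `"/system/xbin/su"` in its inventory would be flagged, while a stock device's file list would pass. Real products combine many more signals than a path list, since tampered devices often hide these artefacts.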
Once this happens, security is no longer somebody else's responsibility but your own. It also represents a headache for IT professionals. If you're bringing your tampered phone to work, are you exposing your company's data unnecessarily?
Crucially, will IT have the expertise to minimise the impact of a security breach if user data is exposed?
Phone restrictions are there for a reason: to keep security tight. If increased privileges are granted, the risk to security from malware or hacking rises. When you have more privileges, so do the bad guys. This is like leaving your front door unlocked.
A solid management policy should include keeping tabs on how users interact with their apps. This is more important than ever when users are bringing their own apps, even though most organisations haven't a clue how many users are doing so.
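Keeping tabs on users' apps can be as simple as comparing each user's app inventory against an approved list. The Python sketch below shows the shape of such an audit; the approved package names and the inventory format are invented for illustration, assuming the organisation can export per-user app lists from its management tooling.

```python
# Sketch: audit the apps each user brings against an approved list.
# Package names and the inventory structure are illustrative only.

APPROVED = {"com.example.mail", "com.example.vpn"}

def unapproved_apps(inventory):
    """Map each user to the sorted list of apps they run that are not
    on the approved list; users with no violations are omitted."""
    report = {}
    for user, apps in inventory.items():
        extras = set(apps) - APPROVED
        if extras:
            report[user] = sorted(extras)
    return report
```

Run against an export, this immediately answers the question most organisations can't: how many unsanctioned apps are actually in use, and by whom.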
Many IT professionals in the UK profoundly underestimate how many apps are used on personal devices at work. You might think it is just a few, but I have read estimates that there could be more like 20 per employee.
Forget all the hoopla surrounding BYOD and how to protect personal phones in the workplace. The real issue isn't the device itself but what's inside it.
Application security needs to change to take into account today's development practices, including agile methodologies such as extreme programming (XP), feature-driven development (FDD), Scrum, and Crystal.
All those mentioned were designed to accelerate the delivery of apps. Unfortunately, apps built this way are often created by very small companies or even a single developer, and are assembled mainly from components, many of which include open source code.
This leads to many development mistakes, and such small teams rarely have the resources for genuine root cause analysis.
Application security approaches that rely on analysing source code aren't much use for companies constructing apps from third-party components. They may not have the source code at all, and they never deal with it directly.
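When the source isn't available, components can still be checked at the binary level, for example by fingerprinting each bundled library and looking it up in a list of known-vulnerable builds. The Python sketch below illustrates the idea; the advisory entry is invented for illustration, and real tooling would query a maintained vulnerability database rather than a hard-coded dictionary.

```python
# Sketch: component-level checking without source code, by hashing each
# bundled binary and looking it up against known-vulnerable builds.
# The advisory data below is invented purely for illustration.
import hashlib

KNOWN_VULNERABLE = {
    hashlib.sha256(b"example vulnerable build").hexdigest():
        "EXAMPLE-2014-0001 in libfoo 1.2 (illustrative entry)",
}

def check_component(blob):
    """Return the advisory string for a component's bytes, or None if the
    fingerprint matches nothing in the known-vulnerable list."""
    digest = hashlib.sha256(blob).hexdigest()
    return KNOWN_VULNERABLE.get(digest)
```

The appeal of this approach for agile teams is speed: a hash lookup is near-instant and produces no false positives for exact matches, though it misses recompiled or patched variants of a vulnerable component.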
In agile development, if the application security process takes too long or throws up too many false positives, it hinders delivery. The result is that developers simply bypass the security process.
And when developers start bypassing security, everyone is in trouble.
Richard Eglon is marketing director at Comms-care