As hackers, malware and spyware creators, and other malicious actors learn how to get past security software, new detection algorithms are created in an effort to stay ahead of the attacks. This constant evolution of security software is meant to catch viruses and stop them before they spread, and the protective software itself requires constant study to keep pace. But is relying solely on protective software the wrong approach to take?
Some researchers believe so. Security specialists are suggesting that the correct action is to prevent systems from being vulnerable to spyware, adware, malware, viruses, and intrusion attempts in the first place. Some subscribe to this theory for ethical reasons, believing that systems or software sold with "security holes" are really defective and unfinished. Others believe that it's an additional money grab on the part of the security software industry. Extremists even believe it stems from an age-old conspiracy: that software providers often leave holes open on purpose so they can charge consumers for software to fix them. This final conspiracy can be debunked with some critical thinking: threats evolve, so holes that were at one point un-exploitable will eventually be exploited.
The truth behind most security software evolution processes is much less exciting; in most cases, security software is a reactive concept. Occasionally, this software can predict the patterns of new viruses that are similar or related to old ones, but the majority cannot detect anything until the code has already been encountered, studied, and deconstructed. It is from this code that virus and infection signatures are developed.
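The signature-matching idea described above can be sketched in a few lines. This is a minimal illustration, not a real engine: the "known sample" bytes and the whole-file hashing scheme are assumptions for the example (production scanners also use byte-pattern and heuristic signatures).

```python
import hashlib

# Hypothetical signature database: digests of payloads that have already
# been encountered, studied, and deconstructed.
KNOWN_SAMPLE = b"hypothetical malicious payload, already deconstructed"
SIGNATURE_DB = {hashlib.sha256(KNOWN_SAMPLE).hexdigest()}

def scan(data: bytes) -> bool:
    """Return True only if the payload matches a stored signature."""
    return hashlib.sha256(data).hexdigest() in SIGNATURE_DB

print(scan(KNOWN_SAMPLE))          # the known sample is caught
print(scan(b"brand-new variant"))  # an unseen variant slips past the scanner
```

The second call is the crux of the article's point: until a new variant has infected someone and been analyzed, no signature exists for it.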
That means that a computer somewhere needs to be infected so that the infection itself can be studied. It is much the same process required for vaccinating against illness: the influenza vaccine could only be developed after the virus had been identified and understood. The software protects individuals from getting the virus in the future, but does nothing to get at the heart of the problem — the computer vulnerabilities that allow malicious code to set up shop.
Thus, both sides keep adapting in an ongoing battle for security. This Band-Aid approach may work temporarily, but it requires constant re-evaluation and massive investment, funded by desperate consumers fearful of losing important data and damaging their machines.
Researchers suggest that three main methods can be used to control systems in a more secure manner. User management, privilege management, and application control can be very effective anti-virus strategies, especially when layered together as a single net.
But just how is this put into place? How is it that we can go about creating computers that are able to work smarter to get the job done safely, securely, and efficiently?
User management is a term that defines how computers handle users within the system, and it is critical to security. Ineffective user management can open security holes that allow for intrusion and code insertion. It is what is used in both Windows and Linux to define permissions, user profiles, and access to files across a hard disk drive. Simply called user management in Linux, its most visible counterpart in Windows Vista and above is UAC (User Account Control). It can add a confirmation step, requiring the user to approve an action before new programs are installed or special access is granted. Typically, a single, top-level user exists within user management. In Windows, this is the administrator. In Linux, it's the root user, often accessed through the sudo command. By tightening up user management, and ensuring that guest accounts and non-administrator accounts are limited, a higher level of security can be achieved.
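The least-privilege idea behind user management can be sketched as follows; this is a minimal Unix-only example under assumed names (the `install_software` function and package name are illustrative, not a real installer).

```python
import os

def install_software(package: str) -> None:
    """Refuse a privileged action when run from a non-administrator account.

    On Unix-like systems, effective user ID 0 is root; any other account
    is denied by default, mirroring the UAC/sudo confirmation model.
    """
    if os.geteuid() != 0:
        raise PermissionError(f"installing {package!r} requires administrator rights")
    print(f"installing {package!r}...")

try:
    install_software("example-package")
except PermissionError as err:
    print(err)  # ordinary users hit this branch
```

Run as a regular user, the call is refused; run via sudo, it proceeds — the decision rests with user management, not with the program.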
Privilege management is a process rather than a unique system. It refers to managing user privileges according to ITU-T Recommendation X.509. This recommendation serves to increase security through the use of public-key certificates and attribute certificates, each of which can be assigned to a user within the system. ITU-T Recommendation X.509 excels because it describes two separate levels of authentication: the first is ideal for everyday use, and the other for increased security. The first is considered a form of simple authentication. It uses a single password to verify that the user has the permissions needed to access data or systems. The recommendation also outlines strong authentication, an increased level of security that uses credentials encrypted or formed through cryptographic techniques. Because simple authentication is no longer truly effective on its own, strong authentication comes highly recommended for securing systems.
As the name suggests, application control quite literally allows an administrator to define "whitelisted" applications, thereby pre-approving which applications can run. All applications outside the whitelist are disallowed. This is an excellent way to stop malware or viruses because of its all-or-nothing approach. Instead of verifying by authentication or code contents, the system will simply deny access until manual access is granted. Essentially, it takes the choice away from the system, which may decide in error, and places it in the hands of the administrator. While passwords may be stolen, those without access to application control will not be able to make changes or install software in the background without seeking permission from whoever does. This is also a great choice for individuals who don't have the computer training to make informed installation choices. It is frequently used in public computer or workplace settings to prevent accidental virus installation or malware downloads, not to mention access to games and other time-wasting applications.
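The all-or-nothing logic of application control can be sketched in a few lines. The paths below are illustrative assumptions; real products typically key the allow-list on signed file hashes or publisher certificates rather than raw paths.

```python
# Administrator-defined allow-list; everything else is denied by default.
WHITELIST = {"/usr/bin/python3", "/usr/bin/ssh"}

def may_run(program_path: str) -> bool:
    """Default-deny: only pre-approved applications are allowed to launch."""
    return program_path in WHITELIST

print(may_run("/usr/bin/ssh"))         # pre-approved, allowed to run
print(may_run("/tmp/downloaded.bin"))  # denied until an administrator adds it
```

Note that the check never inspects the program's contents: an unknown binary is blocked whether it is malicious or benign, which is exactly the trade-off the article describes.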
More and more businesses are realizing that the "ounce of prevention is worth a pound of cure" way of thinking can be much cheaper and more effective than reacting to attacks. Because of this, alternative security measures are on the rise in the enterprise IT suite. While anti-virus software will never go away and will always be a necessity, the new age of security technology is working on prevention as opposed to protection.
The security experts at Arellia provided essential research and information for this report.