Tricking the robot guards
Unpredictability is something most software developers don't want to see in their creations. You've gotta hate it when a program crashes right after it was working "just fine". I'm sure events give GUI developers headaches and hardware interrupts give kernel developers horrible, horrible nightmares. Most of us strive for quite the opposite: total predictability and reliability. And I am happy with that.
Yet, when we deal with the unknown, we should always expect the worst. We never know what's on the other side of the interface. Even if we do, we can't assume it will remain the same over time. We have to deal with unpredictable behavior as best we can, even if it comes down to showing a miserable "I give up" message to the user.
Still, it's unreasonable to expect programs to be prepared for every kind of random behavior, and this is something that probably hasn't been exploited enough. I'm no expert in malicious code, but I have a brain, so here goes: I think malware could take advantage of random behavior to further infiltrate a system and conceal its existence.
As I said, I'm no malware writer. Quite frankly, I despise people who waste their time on such things; software writers shouldn't have to spend half their time building walls and electric fences around their code. But I digress. My point is that I'm not really sure how random behavior could be exploited to gain easier access to a system, so I'll focus on the second part: concealing presence.
There are tools designed to detect malware. Since malware comes in different flavors, these tools tend to use a wide variety of detection techniques to cover (ideally) all types. Some look for specific files, some look for certain running processes, some look for differences in files.
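As an illustration of that last technique, here's a minimal sketch (in Python; the idea is in the spirit of integrity checkers like Tripwire, and the usage shown is hypothetical) of hash-based file checking: record a trusted hash for each monitored file, then later flag any file whose contents no longer match.

```python
import hashlib
import os

def sha256_of(path):
    """Return the SHA-256 hex digest of a file's contents."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def build_baseline(paths):
    """Record a trusted hash for each monitored file."""
    return {p: sha256_of(p) for p in paths}

def find_modified(baseline):
    """Report files that are missing or whose contents changed."""
    modified = []
    for path, known_hash in baseline.items():
        if not os.path.exists(path) or sha256_of(path) != known_hash:
            modified.append(path)
    return modified

# Hypothetical usage: baseline = build_baseline(["/usr/bin/login"]), then
# periodically call find_modified(baseline) and alert on any hits.
```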
Now, certain malware programs will replace DLLs or executables in order to introduce their harmful behavior into the system. Malware authors, in all their brightness, don't look two steps ahead: they simply have their programs get rid of the original files and introduce their own. This makes them easy to detect, because the abnormal behavior shows up simply by executing the code in a controlled environment.
So here's what I think: malware that changes some interface or binary could randomly switch between the wrong behavior and the normal behavior, making detection a little harder. There's some probability that such a test will miss the malware, because the run may happen to exhibit only the normal behavior. I'm pretty sure the robot guards are not programmed for unpredictability: malware detectors follow predefined detection lists and techniques, and are unlikely to repeat the same test on the same component within a short span of time.
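To make that concrete, here's a toy sketch (in Python; every name is invented, and the payload is deliberately an empty placeholder) of a replaced routine that flips a biased coin on each call. Most of the time it simply delegates to the original behavior, so any single sandboxed run is likely to look clean.

```python
import random

PAYLOAD_PROBABILITY = 0.1  # chance of misbehaving on any given call

def run_payload():
    """Placeholder for the malicious side effect; left empty on purpose.
    The point is only how rarely this branch is taken."""
    pass

def original_read_config(path):
    """Stand-in for the legitimate routine the malware replaced."""
    with open(path, "rb") as f:
        return f.read()

def trojaned_read_config(path):
    """Delegates to the original behavior most of the time, so a single
    run in a sandbox observes the payload only with probability 0.1."""
    if random.random() < PAYLOAD_PROBABILITY:
        run_payload()
    return original_read_config(path)
```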
So this is a heads-up for the people who fight malware: think unpredictable. If you only test by execution, do so more than once. Perhaps you've already considered this, but I think it was worth mentioning.
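The arithmetic behind "more than once" is simple: if the payload fires on any given run with probability p, a single behavioral test misses it with probability 1 - p, but n independent runs all come back clean with probability (1 - p)^n. A quick back-of-the-envelope check (Python; the numbers are just an example):

```python
# If the payload fires with probability p on each run, then n
# independent test runs all come back clean with probability (1 - p)**n.
p = 0.1  # example misbehavior rate per run

for n in (1, 5, 10, 20):
    print(f"{n:2d} runs -> missed with probability {(1 - p) ** n:.3f}")

# Output: 1 run -> 0.900, 5 -> 0.590, 10 -> 0.349, 20 -> 0.122
```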
Also, unpredictability works both ways. If I have to break into a place, I'll take a Hollywood-style laser grid over a guard dog any day. Improvisation and randomness are very powerful tools, and changing the rules of the game is something trespassers are never prepared for. I don't have any specific ideas beyond maybe file relocation or security-hole simulation, but some clever ideas could come out of this line of thinking.
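I can't claim anyone actually does this, but as a sketch of the file-relocation idea (Python; every path and name here is invented): move a sensitive file to a randomized location at startup and leave a decoy at the well-known path, so anything that touches the decoy gives itself away.

```python
import os
import secrets
import shutil

WELL_KNOWN_PATH = "/srv/app/secrets.conf"  # hypothetical expected location
HIDDEN_DIR = "/srv/app/private"            # hypothetical storage area

def relocate_and_plant_decoy():
    """Move the real file to a random name and leave a decoy behind.
    Any process that reads the decoy is, by construction, suspect."""
    random_name = secrets.token_hex(16)
    real_path = os.path.join(HIDDEN_DIR, random_name)
    shutil.move(WELL_KNOWN_PATH, real_path)
    with open(WELL_KNOWN_PATH, "w") as decoy:
        decoy.write("# decoy: access to this path is logged\n")
    return real_path  # the legitimate application is told the new location
```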
Security seems to be a topic I keep coming back to. I don't know why I think so much about it. It's not an area I'm very interested in, but past jobs and future posts seem to revolve around the topic. I'm not a specialist in... well, anything, but I hope this can help in some way. This is to the people who spend their day looking for ways of making our computers safer. Keep up the good work.
1 Comment:
Certain malware removers and antiviruses work differently: they classify behavior. E.g., if someone tries logging into a user's account with wrong passwords, all of them in alphabetical order, then something is definitely wrong. Similar rules are defined for antiviruses and malware detectors too.
The problem occurs when there is a security flaw the developers overlooked. E.g., I know that I can't make a hard link to a directory, so as a developer I won't define rules that take that into consideration. But such a thing, if it exists, can be exploited by a malicious user. That is not random behavior on the software's part.
By Sridhar Iyer, at 1/09/2006 2:47 PM