
How to find new persistence tricks?

October 14, 2018 in Autostart (Persistence), Personal, Preaching, Reversing

Every once in a while people ask me how I find all this stuff.

The TL;DR answer is simple: curiosity + reading Microsoft documentation + other people’s research + applying some automation.

At first, it was really just some curiosities that I could not explain when I was less experienced in reversing, e.g. the Visual Basic VBA Monitors. When you use Procmon a lot, some of the stuff you see in the logs eventually gets stuck in your head and becomes really familiar. Such was the case with the HKLM\SOFTWARE\Microsoft\VBA\Monitors key that I saw any time I was analysing a VB application with Procmon. I could not explain it and was curious what it was for… googling around didn’t bring any answers. Eventually I started analysing the actual code that triggers that behavior, and that’s how Beyond good ol’ Run key, Part 6 was born…

Then there is obviously a number of them that were the result of manual, often annoyingly time-consuming code analysis. There were times when I couldn’t find anything new for a few months. Perhaps my assumptions were wrong; perhaps we had already discovered it all… at least so I thought every once in a while… But… then… they keep coming… not only from me, but also from others… And it’s hard to explain how it is even possible… For instance, the recent one is a perfect example of a situation where random luck played a big role. While looking at some unrelated stuff inside kernel32.dll I happened to spot the bit that was loading the callback DLLs. With so many people looking at kernel32.dll over the years, I still find it amazing that we find new stuff there all the time.

Many other cases were a result of more deliberate research; for instance, many persistence mechanisms I described rely on the fact that some programs or components load a number of DLLs, one by one, that are listed under a certain location in the Registry. Such activity needs to rely on Registry enumeration APIs. If you can find programs or DLLs that use these functions, you will most likely find possible persistence mechanisms!
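To make that pattern more concrete, below is a minimal sketch in C of the kind of loop to look for. The key path SOFTWARE\Example\Providers is made up purely for illustration and the code is not taken from any real Microsoft component; the point is the RegOpenKeyExW / RegEnumValueW / LoadLibraryW combination that gives a plugin-loading location away when you spot it in a real binary:

```c
/* plugin_loader_pattern.c -- an illustrative sketch, not code from any real
 * Microsoft component. The key path "SOFTWARE\Example\Providers" is made up;
 * the interesting part is the RegEnumValueW + LoadLibraryW combination. */
#include <windows.h>
#include <stdio.h>

int wmain(void)
{
    HKEY hKey;
    if (RegOpenKeyExW(HKEY_LOCAL_MACHINE, L"SOFTWARE\\Example\\Providers",
                      0, KEY_READ, &hKey) != ERROR_SUCCESS)
        return 1;

    for (DWORD i = 0; ; i++) {
        WCHAR name[256], path[MAX_PATH];
        DWORD cchName = 256;
        DWORD cbData  = sizeof(path) - sizeof(WCHAR);
        DWORD type;

        /* each value is expected to hold a DLL path */
        if (RegEnumValueW(hKey, i, name, &cchName, NULL, &type,
                          (LPBYTE)path, &cbData) != ERROR_SUCCESS)
            break;
        if (type != REG_SZ)
            continue;
        path[cbData / sizeof(WCHAR)] = L'\0';

        /* the give-away: registry-controlled data flowing straight into LoadLibraryW */
        HMODULE hMod = LoadLibraryW(path);
        wprintf(L"%ls -> %ls (%p)\n", name, path, (void *)hMod);
    }
    RegCloseKey(hKey);
    return 0;
}
```

When hunting, you usually work in the opposite direction: you spot LoadLibraryW being fed from a registry enumeration loop inside some system DLL, then work out which key it enumerates and whether you can write to it.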

And then there are keywords, e.g. ‘providers’, a very popular way to name a place in the Registry where a lot of plug-ins are loaded from. Some examples of keys that include the keyword ‘providers’ are shown below (a quick way to hunt for more of them follows the list):

  • SYSTEM\CurrentControlSet\Control\Cryptography\Providers
  • System\CurrentControlSet\Control\SecurityProviders\SSI\Providers
  • SYSTEM\CurrentControlSet\Services\LanmanServer\ShareProviders
  • System\CurrentControlSet\Services\RemoteAccess\Accounting\Providers
  • System\CurrentControlSet\Services\RemoteAccess\Authentication\Providers
  • SYSTEM\CurrentControlSet\Services\W32Time\TimeProviders
  • SYSTEM\CurrentControlSet\Services\WbioSrvc\Service Providers
  • SYSTEM\CurrentControlSet\Services\Winsock\Setup Migration\Providers
  • System\CurrentControlSet\Services\WinTrust\TrustProviders
  • System\CurrentControlSet\Services\WlanSvc\Parameters\ComInterfaceProviders
  • System\CurrentControlSet\Services\WlanSvc\Parameters\VendorSpecificIEProviders
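To hunt for more candidates like the ones above, a quick-and-dirty recursive registry walk is enough. Here is a rough sketch in C; the starting point SYSTEM\CurrentControlSet and the keyword ‘Provider’ are just this post’s example, and anything it prints is only a candidate until the code that reads the key is confirmed to actually load something from it:

```c
/* find_provider_keys.c -- recursively walk HKLM\SYSTEM\CurrentControlSet and
 * print every subkey whose name contains "Provider". Nothing here is specific
 * to any one persistence trick; it just produces a candidate list to review. */
#include <windows.h>
#include <wchar.h>
#include <stdio.h>

static void walk(HKEY hRoot, const WCHAR *path)
{
    HKEY hKey;
    if (RegOpenKeyExW(hRoot, path, 0, KEY_READ, &hKey) != ERROR_SUCCESS)
        return;                                 /* no access, skip the subtree */

    for (DWORD i = 0; ; i++) {
        WCHAR name[256];
        DWORD cchName = 256;
        if (RegEnumKeyExW(hKey, i, name, &cchName,
                          NULL, NULL, NULL, NULL) != ERROR_SUCCESS)
            break;

        WCHAR child[1024];
        if (swprintf(child, 1024, L"%ls\\%ls", path, name) < 0)
            continue;                           /* path too long, skip */

        if (wcsstr(name, L"Provider"))          /* case-sensitive; refine as needed */
            wprintf(L"%ls\n", child);

        walk(hRoot, child);                     /* recurse into the subtree */
    }
    RegCloseKey(hKey);
}

int wmain(void)
{
    walk(HKEY_LOCAL_MACHINE, L"SYSTEM\\CurrentControlSet");
    return 0;
}
```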

I also mentioned Microsoft documentation; it’s like an RFC for Windows programming. I have read a lot of it over the years, and every once in a while some of that old knowledge comes back to me. Ideas for tricks around DDE, WM_HTML_GETOBJECT, as well as the Propagate trick (SetProp) are a result of my experience actually coding for Windows for more than 10 years. These (especially old, legacy) things stay with you and sometimes bring some really refreshing ideas. Not only for persistence tricks.
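For readers who never ran into WM_HTML_GETOBJECT: it is the documented way to pull the live IHTMLDocument2 out of an ‘Internet Explorer_Server’ window from another process, and it is exactly the kind of legacy building block I mean. A minimal sketch, roughly following the old MSDN recipe (assumptions: a classic IE window is running, and you link with oleacc.lib, ole32.lib, oleaut32.lib and uuid.lib):

```c
/* html_getobject_sketch.c -- a rough sketch of the documented pattern for
 * getting an IHTMLDocument2 out of an "Internet Explorer_Server" window. */
#include <windows.h>
#include <oleacc.h>
#include <mshtml.h>
#include <wchar.h>
#include <stdio.h>

/* walk the child windows of the IE frame looking for the rendering window */
static BOOL CALLBACK find_server(HWND hwnd, LPARAM lParam)
{
    WCHAR cls[64];
    GetClassNameW(hwnd, cls, 64);
    if (wcscmp(cls, L"Internet Explorer_Server") == 0) {
        *(HWND *)lParam = hwnd;
        return FALSE;                       /* found it, stop enumerating */
    }
    return TRUE;
}

int wmain(void)
{
    CoInitialize(NULL);

    HWND hFrame  = FindWindowW(L"IEFrame", NULL);   /* classic IE main window */
    HWND hServer = NULL;
    if (hFrame)
        EnumChildWindows(hFrame, find_server, (LPARAM)&hServer);
    if (!hServer) {
        wprintf(L"no Internet Explorer_Server window found\n");
        CoUninitialize();
        return 1;
    }

    /* ask the window for its document object */
    UINT    msg  = RegisterWindowMessageW(L"WM_HTML_GETOBJECT");
    LRESULT lres = 0;
    SendMessageTimeoutW(hServer, msg, 0, 0, SMTO_ABORTIFHUNG, 1000,
                        (PDWORD_PTR)&lres);

    IHTMLDocument2 *doc = NULL;
    if (SUCCEEDED(ObjectFromLresult(lres, &IID_IHTMLDocument2, 0, (void **)&doc))) {
        BSTR url = NULL;
        doc->lpVtbl->get_URL(doc, &url);    /* from here you can script the page */
        wprintf(L"document URL: %ls\n", url ? url : L"(none)");
        if (url) SysFreeString(url);
        doc->lpVtbl->Release(doc);
    }

    CoUninitialize();
    return 0;
}
```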

Then there are ‘magic’ APIs… if you read code and see references to ShellExecute, WinExec, CreateProcess, LoadLibrary, CoCreateInstance and their numerous variations and wrappers, you will soon discover that the Windows ecosystem hardly re-uses code; or, more precisely, it does re-use a lot of it, but it also relies on lots of custom paths added on top of it. Lots of code snippets you come across look like a custom programming endeavor of the coder who wrote that part of the program just to test an idea. That is actually normal, even expected behavior in such a sea of code. But… quite frankly… we really have to thank Microsoft programmers for all the testing & debugging code and error messages/strings that are shipped with the OS. This helps a lot!
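A lazy first pass here (a crude sketch only; real cross-reference hunting belongs in a disassembler) is to simply scan binaries for mentions of these API names and shortlist the ones worth opening up:

```c
/* crude_api_scan.c -- naive substring scan over a raw file; a string hit says
 * nothing about reachability, it only shortlists binaries worth a closer look.
 * Usage: crude_api_scan C:\Windows\System32\some.dll */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

static const char *magic[] = {
    "ShellExecute", "WinExec", "CreateProcess",
    "LoadLibrary", "CoCreateInstance", NULL
};

int main(int argc, char **argv)
{
    if (argc < 2) { fprintf(stderr, "usage: %s <file>\n", argv[0]); return 1; }

    FILE *f = fopen(argv[1], "rb");
    if (!f) { perror("fopen"); return 1; }

    fseek(f, 0, SEEK_END);
    long size = ftell(f);
    fseek(f, 0, SEEK_SET);
    if (size <= 0) { fclose(f); return 1; }

    char *buf = malloc((size_t)size);
    if (!buf || fread(buf, 1, (size_t)size, f) != (size_t)size) { fclose(f); return 1; }
    fclose(f);

    for (int i = 0; magic[i]; i++) {
        size_t n = strlen(magic[i]);
        for (long off = 0; off + (long)n <= size; off++) {
            if (memcmp(buf + off, magic[i], n) == 0) {
                printf("%s: found \"%s\" at offset 0x%lx\n", argv[1], magic[i], off);
                break;                  /* first hit per keyword is enough for triage */
            }
        }
    }
    free(buf);
    return 0;
}
```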

All of these unexpected, probably meant-to-be-private or lab-only code paths provide a lot of interesting opportunities… both for persistence and for LOLBINs; anyone who dares to look for them will eventually find something.

I am fascinated by it; the actual persistence bit is less important, even if on occasion the ‘novelty’ of some of these techniques may have the ‘wow’ factor; the real pleasure for me is derived from these three things:

a) the opportunity to read lots of other people’s code and sharpen my reverse engineering skills

b) the chance to learn how the system works under the hood

c) being ahead of the curve with regard to forensic analysis

Actually, a) and b) are equivalent… and c) is the obvious bit.

If you think of books like Windows Internals or The Art of Memory Forensics, the majority of the information the authors rely on is a result of direct or indirect contact with the actual system internals (and these guys did it a lot). There is no magic wand. Yes, there are source leaks, and there are ex-MS programmers turned researchers who had access to the source at some stage and can leverage that privileged position for some time, but I’d say that the majority of the discoveries presented at conferences over the last 30 years, as well as in books and on blogs, rely on the work of all these poor reversing souls sitting and digging in the OS code all the time. Some of them even become famous and get hired by Microsoft :).

Many developers curse the unpredictable behavior of some APIs and complain about the way things work, yet are often unable to pinpoint the exact reason for a certain behavior so that the root cause can be analyzed. In my eyes, the ability to dig into other people’s code, whether the source is available or not, is a core skill of any programmer, and… perhaps of any information security professional. None of the reversing, forensic, or vulnerability research tools would exist without this ‘poke around in other people’s code’ branch.

So… if you want to find new persistence tricks… pick up any code you think has potential, start digging, and actually discover how things work under the hood. Or at least 0.000001% of it. And no, whatever you find, you don’t need to blog about these new persistence discoveries at all – get off my lawn! 😉

Creolisation, Tergiversation and Equivocation of IR language

July 20, 2018 in Off-topic, Preaching

There is a lot of fun made of the marketing language of infosec. Anyone who is a bit technical knows that it’s a snake oil game that aims at selling at all costs, and the cyber terms coined by the marketing gurus make us all shake our heads (cyber pathogens, cyber Armageddon, cyber Pearl Harbor, cyber 9/11, etc.).

For a change, I’d like to talk about the language of the people working in IR. I find it quite interesting and actually struggle a lot with adopting certain terms, as they sound quite foreign to me, if not pretentious.

Newcomers entering this field don’t have an easy life, at least from a linguistic perspective. The field is relatively new, and many people still enter it by chance, or thanks to their background in various ‘related’ disciplines: law enforcement, digital forensics, audits, fraud analysis, network engineering, system architecture, reverse engineering, malware analysis, intelligence services, helpdesk, as well as completely unrelated ones: chemistry, biology, medicine, music, and many other disciplines. They bring their habits, language, points of view, and attitudes, which I think shape the IR lingo: one that resembles a pompous creole language of sorts.

Many people who came to IR with digital forensics experience tend to be very cautious and make lots of statements that are very much aligned with the legal responsibility they encountered as forensic experts testifying in court. They bring tons of words and statements that may often feel like weasel words to technical people who have never experienced the harsh scrutiny witnesses face in court. Hence, we start saying ‘allegedly’, ‘probably’, ‘it would seem’, ‘evidence suggests’, ‘I believe’, etc. more often than in the past. Everything is possible, but… everything is also uncertain.

The non-technical individuals with a background in the military or intelligence brought us a very large corpus of terms that even a few years ago no one in infosec had heard of. There are no more ‘bad guys’, ‘virus writers’, and ‘hackers’. Now we all talk about ‘actors’, ‘adversaries’, ‘intel’, ‘TTPs’, ‘indicators’, ‘HUMINT’, ‘SIGINT’, etc., and since we entered geopolitics we also have ‘attribution’ and ‘nation state actors’, plus ‘red teams’ and ‘blue teams’. And let’s not forget to mention the popular units ‘8200’ or ‘61398’. Oh, and we totally ‘nuke’ things.

Let’s admit it. Compliance guys came up with a lot of good ideas. While many technical people don’t like compliance, or auditors, and perceive these ‘checkbox activities’ as the embodiment of this industry’s ignorance, it is really important to highlight that compliance frameworks do impact organizations in a very positive way. They bring structure, force orgs to create processes that introduce accountability, affect the architecture, and change the way they do business. As for the language, we all now know about ‘confidentiality’, ‘integrity’, and ‘availability’, don’t we? We also know about ‘business resilience’ and ‘disaster recovery’. And lo and behold – we even started thinking more about the business we protect instead of just looking at the technical aspects of attacks and eyeballing the blinkenlights. While being a ‘cost center’, it is important to give a bit of thought to the ‘customer’ and where the money comes from. And in my experience that last bit appears in conversations far more often now than, say, 10 years ago (in technical circles). Then we have ‘findings’, ‘RFIs’, ‘risk scores’, ‘risk posture’, ‘risk management’, ‘data in transit’, ‘data at rest’, and lo and behold… ‘security controls’ and ‘acceptable use policy violations’. POS malware also brought a lot of opportunities to discuss ‘magnetic stripe’, ‘track data’, and ATMs. IR is becoming compliance on so many fronts!

Then we have network engineers; even today we come across guys who use slightly archaic terms like ‘octets’ for the bytes being transmitted in packets. You probably rarely hear of datagrams, but you definitely hear ‘egress’, ‘ingress’, and ‘routing’ all the time. Many younger people find these concepts a bit unclear, as in 2018 we all tend to think of uploading/downloading, or sending/receiving data, because… well… that’s how the internet works today (in general, I think the mindset of many people entering IR now sits at a much higher level of the OSI model than, say… in 2000).

Scientific language brought us ‘viruses’ and ‘samples’ of course, but there are now also ‘implants’, ‘payloads’, ‘detonation’, ‘anomalies’, ‘regression’, ‘machine learning’, ‘clustering’, and ‘graphs’. And then there is the whole gallery of code names borrowed from the animal kingdom (‘pandas’, ‘bears’, ‘kittens’, ‘tigers’, etc.). We do ‘proofs of concept’ in the ‘labs’, and we work our ideas starting with a ‘hypothesis’. And as for medicine… sometime in 2017 there was a Twitter question about tech terms that have their roots in medicine. I, among others, contributed quite a few answers to that thread. I thought it would be nice to just drop a superset of IR-related terms here:

abort, agent, anatomy (of a virus), anomaly, antiviral, assessment, attack, backbone, backtracking, bacteria, blackout, blue pill, buffer, cell, census, channel, check-up, clone, compress, congestion, contagion, containment, contamination, defect, defense, diagnose, diagnostics, disc, disease, disinfect, dissection, dissemination, DNA, downstream, epidemics, eradication, exercise, extract, gene, genetic, heartbeat, host, hub, hygiene, immune, immunize, implant, indicator, infection, infestation, influenza, inject, injection, inoculation, isolation, lab, life-support, malignant, microbe, monitoring, mutation, nematode, outbreak, patch, pathogen, pathology, patient 0, pattern, penetration, post mortem, probe, prophylactics, quarantine, recovery, red pill, remedies, replication, retrovirus, sample, sanitization, scanning, screen, segment, spread, stat, stop the bleeding, strain (as in malware strain), stress test, subject, system health, tag, test, transmission, trauma, triage, USB condom, vaccine, vector (as in attack vector), virus, vitals, vulnerabilities, worm, x-rays (type of malware scanning), zombie

And last, but not least – let’s not forget about the ‘centrifuges’. Who in infosec would have imagined talking about stuff like this 10 years ago…?

Despite all the efforts to stay technical and binary, it would seem that we are more and more vague, indecisive, perhaps in way over our heads. We are accidental ‘jacks of all trades’, in roles that deal with more ambiguity, uncertainty, and pure ignorance (our own!**) in constant need of quick and urgent fixing (**not a fault – we just don’t know everything and always find something new to learn) than any other IT position.

We are cyber-warriors, cyber-ninjas, white hats, busticati, evangelists, thought leaders, and even celebrity CISOs. But perhaps also, and often without any bad intent, just very lucky, career-oriented, fad-driven, over-entitled imposters and… kinda infosec bots. I am confident in my belief that we should wait for more evidence to support my hypothesis, and until then, let’s tentatively agree that IR is an art, and that if we lived in ancient Greece, there would totally be a dedicated muse for it.