DEFCON 15: How I Learned to Stop Fuzzing and Find More Bugs
Speaker: Jacob West, Manager, Security Research Group, Fortify Software
Fuzzing and other runtime testing techniques are great at finding certain kinds of bugs. The trick is, effective fuzzing requires a lot of customization. The fuzzer needs to understand the protocol being spoken, anticipate the kinds of things that could go wrong in the program, and have some way to judge whether or not the program has gone into a tailspin. Get this setup wrong, and you end up fuzzing the wrong thing, exercising and re-exercising trivial paths through the program, or just plain missing bugs (as Microsoft did with the .ANI cursor vulnerability). Fuzzing effectively takes a lot of customization and a lot of time.
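The customization burden described above can be seen even in the simplest mutation fuzzer: the harness below is a minimal, hypothetical sketch (the `mutate`, `fuzz`, and `target` names are illustrative, not from the talk) showing how a dumb fuzzer mutates a seed input and watches for failures. Note how much it leaves out: it knows nothing about the protocol, and its only crash oracle is an uncaught exception.

```python
import random

def mutate(seed: bytes, n_flips: int = 4) -> bytes:
    """Randomly overwrite a few bytes of a seed input (dumb mutation fuzzing)."""
    data = bytearray(seed)
    for _ in range(n_flips):
        pos = random.randrange(len(data))
        data[pos] = random.randrange(256)
    return bytes(data)

def fuzz(target, seed: bytes, iterations: int = 200):
    """Feed mutated inputs to `target`; collect inputs that raise an exception.

    A real harness would also watch for signals, hangs, and memory errors --
    the "tailspin" detection problem the abstract mentions.
    """
    crashes = []
    for _ in range(iterations):
        case = mutate(seed)
        try:
            target(case)
        except Exception as exc:
            crashes.append((case, exc))
    return crashes
```

Making this effective against a real program means teaching it the input grammar and adding better failure detection, which is exactly the customization cost the abstract argues is better spent elsewhere.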
Proponents of fuzzing often avoid static analysis, citing irrelevant results and false positives as key pain points. But is there a more effective way to channel the energy required for good fuzzing in order to find more bugs faster? This presentation will propose a series of techniques for customizing static, rather than dynamic, tools that will let you find more and better-quality bugs than you ever thought possible.
We compare static and dynamic approaches to testing and look at:
- The fundamental problems involved in fuzzing
- Why static analysis is harder for humans to think about than fuzzing
- Interfaces for customizing static analysis tools
- The kinds of bugs static analysis is good at finding
- Why static analysis is both faster and more thorough than fuzzing
- Where static analysis tools break down
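To give a flavor of the "interfaces for customizing static analysis tools" bullet above, here is a toy, hypothetical rule engine built on Python's standard `ast` module (the `DANGEROUS_SINKS` set and function names are illustrative, not Fortify's API): a custom rule is just a configurable list of sink functions, and the checker flags every call to one of them.

```python
import ast

# Hypothetical rule set: call targets treated as dangerous sinks.
# In a real tool this list is the customization interface.
DANGEROUS_SINKS = {"eval", "exec", "os.system"}

def qualified_name(node: ast.expr) -> str:
    """Best-effort dotted name for a call target, e.g. 'os.system'."""
    if isinstance(node, ast.Name):
        return node.id
    if isinstance(node, ast.Attribute):
        return qualified_name(node.value) + "." + node.attr
    return "<unknown>"

def find_sink_calls(source: str):
    """Return (line, name) pairs for each call to a configured sink."""
    findings = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Call):
            name = qualified_name(node.func)
            if name in DANGEROUS_SINKS:
                findings.append((node.lineno, name))
    return findings
```

Unlike the fuzzer, this inspects every call site in the code at once, with no need to construct an input that reaches it; the trade-off, as the last bullet notes, is false positives when a flagged call is actually safe.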
The talk concludes with the results of an experiment we conducted on open-source code to compare the effectiveness of fuzzing and static analysis at finding a known set of security bugs.
For more information visit: http://bit.ly/defcon15_information
To download the video visit: http://bit.ly/defcon15_videos
Video "DEFCON 15: How I Learned to Stop Fuzzing and Find More Bugs" from the channel Christiaan008