The organizers of the recent Static Analysis Symposium — conveniently held four blocks from my office — were kind enough to invite me to give the opening talk. Now, this is a conference where the presentations have titles like “Efficient Generation of Correctness Certificates for the Abstract Domain of Polyhedra”; I know what all those words mean individually, it’s just them next to each other in that order that I don’t understand. Fortunately for me, the SAS organizers invite people in industry to give talks about the less academic, more pragmatic aspects of program analysis, which I was happy to do.
They also let me pad my presentation with funny pictures of cats, which helped a lot.
Unfortunately I don’t have a recording of the talk, but my slides are posted here if you want to check them out.
Special thanks to Scott Meyer of BasicInstructions.net who was kind enough to allow me to use his comic about informative presentations in my informative presentation.
Please tell me that cat photo is the only reference to this, and that it wasn’t your opening.
No matter what you think of Steve as a CEO, certainly no one can accuse him of being “stuffy”. Business Cat, by contrast, is looking pretty conservative in that tie.
Is Coverity not oriented for personal development/hobbyist projects? I can’t find any pricing on the website.
It is not; our target market is software companies, not individual developers.
However, Coverity’s C/C++ and Java analysis is available for free to registered open source projects. (C# analysis is unfortunately not offered at this time, though I hope that we can start offering that service at some point.) See http://scan.coverity.com/ for details.
Awesome. I look forward to someday using it for my C# projects.
Scott Meyer is awesome and it’s a mark of good taste that you would use one of his comics.
Hmm… when you say “I don’t have a recording,” does that mean that you personally don’t have one but one exists, or that one exists and you don’t know about it, or that one exists and maybe someone you know has it? In short: is there any chance we’ll ever get to see the presentation? The slide deck alone was brilliant!
Thanks, I’m glad you liked it.
I do not believe the talk was recorded at all, unfortunately.
“I know what all those words mean individually, it’s just them next to each other in that order that I don’t understand.”
This is without a doubt one of the best ways I’ve ever read to say that. I am going to quote you on this henceforth.
I don’t want you to get the impression that it’s just the number of words; getting them in the right order is just as important.
I hadn’t clicked the link earlier – I’d just assumed it was the obvious Python (Monty, Ltd.) reference. Most excellent.
Sounds like static analysis faces the same issues we as programmers face when trying to move people away from wasteful “traditional” techniques. Things are viewed as OK because people assume they work, but no one cares to notice the elephant in the room.
Cat worrying about “dynamic” bugs literally killed me 🙂
As a recent purchaser of Coverity C# analysis, I say, “bring on the ‘churn’!” I would be ecstatic if our Coverity defect count jumped from 150 (with 34% false positives) to 1500 (but hopefully with a lower FP rate).
Raygun (http://raygun.io) is reporting the crashes so I know they’re there. I’d love to find them before the customers do.
Please don’t dumb the tools down by assuming that we’ll be upset if the new version works better (and reports more defects). At our company, developers buy development tools, and we don’t perversely incentivise management to keep some arbitrary reported number low. I want to fix crashes and data corruption bugs before they’re shipped to customers, and I’d love it if Coverity would help me do that.
At a company I used to work at we discussed FindBugs one day. I said it was great because on projects it had never been run on before, almost 50% of the issues it reports are actual, we-should-fix-this bugs (discounting not just false positives but also reports of things that are technically wrong but not important enough to fix).
A coworker dismissed it because he thought that rate was not good enough to bother.
I didn’t – and still don’t – understand that attitude. It usually doesn’t take long to check an item and decide whether it’s a false positive or something you don’t want to spend time fixing, and if I can eliminate 10 actual bugs by looking through 20 or even 50 items, that seems like a good use of time.
OK, now suppose that instead of eliminating ten bugs out of twenty or fifty, you’re eliminating a thousand bugs out of five thousand reports, four thousand of which are a waste of time. What’s the cost, and how many new features could that money buy?
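To put very rough numbers on it, here is a back-of-the-envelope sketch of just the triage side; the five-minutes-per-report figure and the loaded hourly cost are numbers I am inventing purely for illustration:

```python
# Back-of-the-envelope triage cost; every constant here is an
# illustrative assumption, not a measurement.

reports = 5000            # total reports from the analyzer
real_bugs = 1000          # reports that turn out to be worth fixing
minutes_per_report = 5    # assumed average time to triage one report
hourly_cost = 100         # assumed loaded cost of a developer-hour, in dollars

triage_hours = reports * minutes_per_report / 60
triage_cost = triage_hours * hourly_cost

print(f"Triage effort: {triage_hours:.0f} hours (~{triage_hours / 40:.1f} developer-weeks)")
print(f"Triage cost:   ${triage_cost:,.0f}, or ${triage_cost / real_bugs:.0f} per real bug")
```

Even with those modest assumptions, triage alone consumes roughly ten developer-weeks before a single bug is fixed; whether that is a better use of the money than new features is exactly the question.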
I see the point, but our projects were small enough that you could check (though perhaps not fix) all the reports for one of them in an afternoon, IIRC.
Only a fraction of those bugs will actually be noticed if they are not fixed, but when one does surface, the cost can be large (say, if it strikes during a presentation, or if user data gets corrupted). And even when the cost is modest, tracking a bug down by its symptoms can be much more time-consuming than finding it through static analysis. I would assume that adding new features also becomes easier once those bugs are eliminated.
Additionally, this is something I like to do when my current task is to familiarize myself with a project I haven’t worked on yet, since you get lots of little tasks spread all over the code and so get a bit of a tour :).
I don’t have any actual data on when the effort exceeds the benefits, though. It depends a lot on the importance of the project, on the consequences of failure, and on the expected lifetime of the project. Do you have a rule of thumb?
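For what it’s worth, the comparison I have in mind looks roughly like the sketch below; every number in it is invented for illustration, not measured:

```python
# Expected-value comparison: triage reports now vs. debug from symptoms later.
# All constants are invented assumptions for the sake of the comparison.

triage_minutes = 5     # assumed time to evaluate one report
fix_minutes = 30       # assumed time to fix a real bug found by analysis
hit_rate = 0.2         # assumed fraction of reports that are real bugs
p_noticed = 0.3        # assumed chance an unfixed bug surfaces in the field
debug_minutes = 240    # assumed time to track a field failure from its symptoms

# Cost per report if we triage everything and fix what turns out to be real:
cost_triage = triage_minutes + hit_rate * fix_minutes

# Expected cost per report if we ignore the tool entirely.
# Note: this omits external costs (corrupted user data, reputation),
# which only makes ignoring the tool look better than it really is.
cost_ignore = hit_rate * p_noticed * debug_minutes

print(f"Triage and fix: {cost_triage:.0f} minutes per report")
print(f"Ignore:         {cost_ignore:.0f} minutes per report, in expectation")
```

With those invented numbers, triaging comes out slightly ahead, but the break-even point moves quickly with the hit rate and the cost of a field failure, which is probably why a universal rule of thumb is so elusive.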