<!DOCTYPE HTML>
<html lang="en">
<head lang="en">
<meta http-equiv="Content-Type" content="text/html; charset=utf-8">
<meta name="Author" content="Eric S. Raymond">
<meta name="Description" content="gpsd is a utility that can listen to a GPS or AIS receiver and re-publish the positional data in a simpler format.">
<meta name="Keywords" content="GPS, translator, GIS">
<meta name="Revised" content="9 April 2015">
<meta name="robots" content="index,follow">
<!-- the following is a verification tag for Google Analytics -->
<meta name="verify-v1" content="kb4f3qxJPMaOltYTpJHnvbnteza1PHO4EhrRIfcq2gk=">
<title>GPSd &mdash; Put your GPS on the net!</title>
<link rel="stylesheet" href="main.css" type="text/css">
</head>
<body>
<div id="Header">
How We Engineer For High Reliability.
</div>
<div id="Menu">
<img src="gpsd-logo-small.png" alt="Small gpsd Logo" height="126"
width="105"><br>
Home<br>
<div>
<a href="#news">News</a><br>
<a href="#downloads">Downloads</a><br>
<a href="#mailing-lists">Mailing lists</a><br>
<a href="#documentation">Documentation</a><br>
<a href="faq.html">FAQ</a><br>
<a href="xgps-sample.html">Screenshots</a><br>
<a href="#recipes">Recipes</a><br>
<a href="#others">Other GPSDs</a><br>
<a href="hardware.html">Hardware</a><br>
<a href="for-vendors.html">For GPS Vendors</a><br>
<a href="wishlist.html">Wish List</a><br>
<a href="hall-of-shame.html">Hall of Shame</a><br>
<a href="troubleshooting.html">Troubleshooting Guide</a><br>
<a href="hacking.html">Hacker's Guide</a><br>
<a href="references.html">References</a><br>
<a href="protocol-transition.html">Application Compatibility</a><br>
<a href="history.html">History</a><br>
<a href="future.html">Future</a><br>
</div>
<div>&nbsp;</div>
<a href='http://www.catb.org/hacker-emblem/'><img
src='glider.png' alt='hacker emblem' height="55" width="55"></a><br>
<script src="https://www.openhub.net/p/3944/widgets/project_thin_badge.js"></script>
<hr>
<script><!--
google_ad_client = "pub-1458586455084261";
google_ad_width = 160;
google_ad_height = 600;
google_ad_format = "160x600_as";
google_ad_type = "text";
google_ad_channel = "";
//--></script>
<script src="https://pagead2.googlesyndication.com/pagead/show_ads.js">
</script>
<hr>
<a href="https://validator.w3.org/check/referer"><img
src="https://www.w3.org/Icons/valid-html401"
alt="Valid HTML 4.01!" height="31" width="88"></a>
</div>
<div id="Content">
<p>GPSD has an exceptionally low defect rate. Our first <a
href='http://coverity.com'>Coverity</a> scan, in March 2007, turned up
only 2 errors in over 22KLOC; our second, in May 2012, turned up just
13 errors in 72KLOC, all on rarely-used code paths. Though the
software is very widely deployed on multiple platforms, we often go
for months between new tracker bugs.</p>
<p>Here's how that's done:</p>
<h2>We have an extensive suite of unit tests and regression tests</h2>
<p>GPSD has around 100 unit tests and regression tests, including sample
device output for almost every sensor type we support. We've put a lot of
effort into making the tests easy and fast to run so they can be run
often. This makes it actively difficult for random code changes to
break our device drivers without somebody noticing.</p>
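<p>To make the pattern concrete, here is a minimal, self-contained
sketch in the same table-driven spirit: canned sentences paired with
expected verdicts, checked in a loop. It is illustrative only; the
checksum routine and fixture table were made up for this page, not
lifted from GPSD's actual harness.</p>
<pre>
/*
 * Illustrative only: a self-contained, table-driven test in the same
 * spirit as GPSD's suite.  Canned sentences are paired with expected
 * verdicts; any mismatch names the offending sample and fails the run.
 */
#include &lt;stdbool.h>
#include &lt;stdio.h>
#include &lt;stdlib.h>
#include &lt;string.h>

/* Verify an NMEA checksum: the XOR of the bytes between '$' and '*'
 * must equal the two hex digits after the '*'. */
static bool nmea_checksum_ok(const char *sentence)
{
    const char *star = strrchr(sentence, '*');
    unsigned char sum = 0;

    if (sentence[0] != '$' || star == NULL)
        return false;
    for (const char *cp = sentence + 1; cp &lt; star; cp++)
        sum ^= (unsigned char)*cp;
    return sum == (unsigned char)strtoul(star + 1, NULL, 16);
}

static const struct {
    const char *sentence;   /* captured (or deliberately corrupted) output */
    bool valid;             /* expected verdict */
} samples[] = {
    {"$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47", true},
    {"$GPGGA,123519,4807.038,N,01131.000,W,1,08,0.9,545.4,M,46.9,M,,*47", false},
};

int main(void)
{
    int failures = 0;

    for (size_t i = 0; i &lt; sizeof(samples) / sizeof(samples[0]); i++)
        if (nmea_checksum_ok(samples[i].sentence) != samples[i].valid) {
            fprintf(stderr, "FAIL: %s\n", samples[i].sentence);
            failures++;
        }
    return failures != 0;   /* a nonzero exit makes automation notice */
}
</pre>
<p>The real regression tests operate at a higher level, replaying
captured device logs through the daemon and diffing the output against
stored check files, but the principle is the same: every sample ever
captured keeps getting re-checked, run after run, essentially for
free.</p>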
<p>Which isn't to say those drivers can't be wrong, just that the ways
they can be wrong are constrained to either:</p>
<ol>
<li> a protocol-spec-level misunderstanding of what the driver is
supposed to be doing, or </li>
<li> an implementation bug somewhere in the program's
state space that is obscure and difficult to reach. </li>
</ol>
<p>Our first Coverity run turned up only two driver bugs: static
buffer overruns in the methods for changing a device's reporting
protocol and line speed. These escaped notice because they can't be
exercised in our test harnesses, only on a live device.</p>
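<p>The bug class is the classic one in C: formatting a command into a
fixed-size buffer without bounding the write. Here is a hedged sketch
of the shape of the fix; the function name and control-string format
are invented for illustration, not taken from a GPSD driver.</p>
<pre>
/*
 * Sketch of the overrun class and its fix.  The function name and
 * control-string format are hypothetical, not GPSD driver code.
 */
#include &lt;stdio.h>

/* The risky pattern: sprintf() into a fixed buffer silently writes
 * past the end if the expansion is longer than expected, and a harness
 * that never talks to real hardware will never exercise it. */

/* The safer pattern: bound the write and report truncation instead. */
static int format_speed_cmd(char *buf, size_t len, unsigned int speed)
{
    int n = snprintf(buf, len, "$PXYZ,SPEED,%u", speed);

    if (n &lt; 0 || (size_t)n >= len)
        return -1;              /* encoding error or truncation */
    return 0;
}

int main(void)
{
    char cmd[32];

    if (format_speed_cmd(cmd, sizeof(cmd), 115200u) == 0)
        puts(cmd);
    return 0;
}
</pre>
<p>A truncated command can be detected and refused before it ever
reaches the device; a silent overrun cannot.</p>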
<p>This is also why Coverity didn't find defects on commonly-used code
paths. If there'd been any, the regression tests probably would have
smashed them out long ago. A great deal of boring, grubby, finicky
work went into getting our test framework in shape, but it has paid
off hugely.</p>
<h2>We use every fault scanner we can lay our hands on</h2>
<p>We regularly apply Coverity, <a
href='http://sourceforge.net/apps/mediawiki/cppcheck/'>cppcheck</a>,
and <a
href="http://clang-analyzer.llvm.org/scan-build.html">scan-build</a>.
We have not yet been able to eliminate all scan-build warnings, but we
require the code to audit clean under all the other tools on each
release.</p>
<p>We used to use <a href='http://www.splint.org'>splint</a>, until
we found that we couldn't replicate the results of splint runs
reliably across different Linux distributions. It was also far and away
the biggest pain in the ass to use: you have to drop cryptic, cluttery
magic comments all over your source to pass hints to splint and
suppress its extremely voluminous and picky output. We have retired it
in favor of more modern analyzers.</p>
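<p>For the curious, this is roughly what that looked like. The
@-bracketed annotations are splint's own syntax; the little function
wrapped around them is a made-up stub, not anything from our source
tree.</p>
<pre>
/*
 * What "passing hints to splint" looked like: annotation comments on
 * nearly every declaration.  The annotations are splint's own syntax;
 * the code around them is an illustrative stub, not GPSD source.
 */
#include &lt;stdio.h>

static /*@null@*/ const char *last_driver;    /* may legitimately be NULL */

static void note_driver(/*@notnull@*/ const char *name,
                        /*@unused@*/ int debuglevel)
{
    (void)debuglevel;           /* keep ordinary compilers quiet too */
    last_driver = name;
}

int main(void)
{
    note_driver("NMEA0183", 0);
    printf("driver: %s\n", last_driver ? last_driver : "(none)");
    return 0;
}
</pre>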
<p>cppcheck is much newer and much less prone to false
positives. Likewise scan-build. But here's what experience tells us:
each of these tools finds overlapping but different sets of
bugs. Coverity is, by reputation at least, capable enough that it
might dominate one or more of them - but why take chances? Best to use
them all and confine the population of undiscovered bugs to as
small a fraction of the state space as we can.</p>
<p>We also use <a href='http://valgrind.org/'>valgrind</a> to check
for memory leaks, though this is not expected to turn up bugs (and
doesn't) due to our no-dynamic-allocation <a href="hacking.html#malloc">house
rule</a>.</p>
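<p>Sketched below, with an invented structure and sizes rather than
GPSD's real session layout, is what the house rule looks like in
practice: every buffer lives inside a fixed-size, caller-owned
structure, so there is simply nothing on the heap for valgrind to
find.</p>
<pre>
/*
 * Illustrative only: the no-malloc style.  All storage lives inside a
 * fixed-size structure owned by the caller (here a static), so there
 * are no heap allocations to leak.  Not GPSD's actual session layout.
 */
#include &lt;string.h>

#define MAX_PACKET 1024         /* illustrative cap, not GPSD's value */

struct session {
    char inbuf[MAX_PACKET];     /* raw bytes read from the device */
    size_t inbuflen;            /* how much of inbuf is currently valid */
};

/* Append newly read bytes; refuse, rather than grow, when full. */
static int session_feed(struct session *sp, const char *data, size_t len)
{
    if (len > sizeof(sp->inbuf) - sp->inbuflen)
        return -1;              /* caller must drain or reset first */
    memcpy(sp->inbuf + sp->inbuflen, data, len);
    sp->inbuflen += len;
    return 0;
}

int main(void)
{
    static struct session session;      /* static lifetime, never freed */

    return session_feed(&amp;session, "$GPGGA,", 7) == 0 ? 0 : 1;
}
</pre>
<p>The trade is flexibility for predictability: buffer sizes are
visible at compile time, exhaustion becomes an explicit error path,
and whole classes of leak and lifetime bugs cannot occur at all.</p>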
<h2>We are methodical and merciless</h2>
<p>Neither magic nor genius is required to get defect densities as
low as GPSD's. It's more a matter of sheer bloody-minded persistence -
the willingness to do the up-front work required to apply and
discipline fault scanners, write test harnesses, and automate your
verification process so you can run a truly rigorous validation with
the push of a button.</p>
<p>Many more projects could do this than do. And many more projects
should.</p>
<hr>
<script src="datestamp.js"></script>
</div>
</body>
</html>