<!DOCTYPE HTML>
<html lang="en">
<head lang="en">
<meta http-equiv="Content-Type" content="text/html; charset=utf-8">
<meta name="Author" content="Eric S. Raymond">
<meta name="Description" content="gpsd is a utility that can listen to a GPS or AIS receiver and re-publish the positional data in a simpler format.">
<meta name="Keywords" content="GPS, translator, GIS">
<meta name="Revised" content="9 April 2015">
<meta name="robots" content="index,follow">
<!-- the following is a verification tag for Google Analytics -->
<meta name="verify-v1" content="kb4f3qxJPMaOltYTpJHnvbnteza1PHO4EhrRIfcq2gk=">
<title>GPSd &mdash; Put your GPS on the net!</title>
<link rel="stylesheet" href="main.css" type="text/css">
</head>
<body>
<div id="Header">
How We Engineer For High Reliability.
</div>
<div id="Menu">
<img src="gpsd-logo-small.png" alt="Small gpsd Logo" height="126"
width="105"><br>
Home<br>
<div>
<a href="#news">News</a><br>
<a href="#downloads">Downloads</a><br>
<a href="index.html#install">Installation &amp; Building</a><br>
<a href="#mailing-lists">Mailing lists</a><br>
<a href="#documentation">Documentation</a><br>
<a href="faq.html">FAQ</a><br>
<a href="xgps-sample.html">Screenshots</a><br>
<a href="#recipes">Recipes</a><br>
<a href="#others">Other GPSDs</a><br>
<a href="hardware.html">Hardware</a><br>
<a href="for-vendors.html">For GPS Vendors</a><br>
<a href="wishlist.html">Wish List</a><br>
<a href="hall-of-shame.html">Hall of Shame</a><br>
<a href="troubleshooting.html">Troubleshooting Guide</a><br>
<a href="hacking.html">Hacker's Guide</a><br>
<a href="references.html">References</a><br>
<a href="protocol-transition.html">Application Compatibility</a><br>
<a href="history.html">History</a><br>
<a href="future.html">Future</a><br>
</div>
<div>&nbsp;</div>
<a href='http://www.catb.org/hacker-emblem/'><img
src='glider.png' alt='hacker emblem' height="55" width="55"></a><br>
<script src="https://www.openhub.net/p/3944/widgets/project_thin_badge.js"></script>
<hr>
<script><!--
google_ad_client = "pub-1458586455084261";
google_ad_width = 160;
google_ad_height = 600;
google_ad_format = "160x600_as";
google_ad_type = "text";
google_ad_channel = "";
//--></script>
<script src="https://pagead2.googlesyndication.com/pagead/show_ads.js">
</script>
<hr>
<a href="https://validator.w3.org/check/referer"><img
src="html5.png"
alt="Valid HTML 5!" height="31" width="88"></a>
</div>
<div id="Content">
<p>GPSD has an exceptionally low defect rate. Our first <a
href='http://coverity.com'>Coverity</a> scan, in March 2007, turned up
only 2 errors in over 22KLOC; our second, in May 2012, turned up just
13 errors in 72KLOC, all on rarely-used code paths. Though the
software is very widely deployed on multiple platforms, we often go
for months between new tracker bugs.</p>
<p>Here's how that's done:</p>
<h2>We have an extensive suite of unit tests and regression tests</h2>
<p>GPSD has around 100 unit tests and regression tests, including sample
device output for almost every sensor type we support. We've put a lot of
effort into making the tests easy and fast to run so they can be run
often. This makes it actively difficult for random code changes to
break our device drivers without somebody noticing.</p>
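<p>To make that concrete, here is a minimal sketch (the helper and the
harness here are hypothetical, not gpsd's actual test code) of the kind
of check such tests amount to: take a captured sentence with a
known-good result and assert that the code still computes it.</p>
<pre>
/* Minimal sketch of a unit-test-style check; illustrative only. */
#include &lt;assert.h>
#include &lt;stdio.h>

/* NMEA checksum: XOR of every byte between the leading '$' and the '*'. */
static unsigned char nmea_checksum(const char *sentence)
{
    unsigned char sum = 0;
    const char *p;

    for (p = sentence + 1; *p != '*'; p++) {
        if (*p == '\0')
            return 0;        /* malformed: no '*' terminator */
        sum ^= (unsigned char)*p;
    }
    return sum;
}

int main(void)
{
    /* A captured sentence paired with its known-good checksum. */
    assert(nmea_checksum("$GPGLL,4916.45,N,12311.12,W,225444,A*31") == 0x31);
    puts("checksum unit test passed");
    return 0;
}
</pre>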
<p>Which isn't to say those drivers can't be wrong, just that the ways
they can be wrong are constrained to either:</p>
<ol>
<li> a protocol-spec-level misunderstanding of what the driver is
supposed to be doing, or </li>
<li> an implementation bug somewhere in the program's
state space that is obscure and difficult to reach. </li>
</ol>
<p>Our first Coverity run turned up only two driver bugs - static
buffer overruns in the methods for changing a device's reporting
protocol and line speed. They escaped notice because those code paths
can't be exercised in our test harnesses, only against a live
device.</p>
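<p>Here is an illustrative sketch of that defect class (the command
format and buffer size are invented, not the actual driver code):
building a mode-switch string with sprintf() into a fixed buffer can
silently run past its end, and a harness that replays logged sensor
output never sends such commands, so it never reaches the bug.</p>
<pre>
#include &lt;stdio.h>

/* Illustrative only: invented command syntax, deliberately tight buffer. */
static void set_speed_unsafe(unsigned speed)
{
    char msg[12];
    /* sprintf() doesn't know the buffer size; a six-digit speed makes
     * the formatted command longer than msg, overrunning it.  Only a
     * live device ever triggers this code. */
    sprintf(msg, "$PXXX,SPD=%u*00\r\n", speed);
    /* ... write msg to the device ... */
}

static void set_speed_safe(unsigned speed)
{
    char msg[32];
    /* snprintf() bounds the write to the buffer size. */
    (void)snprintf(msg, sizeof(msg), "$PXXX,SPD=%u*00\r\n", speed);
    /* ... write msg to the device ... */
}

int main(void)
{
    set_speed_safe(115200);
    (void)set_speed_unsafe;     /* shown for contrast; not called */
    return 0;
}
</pre>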
<p>This is also why Coverity didn't find defects on commonly-used code
paths. If there'd been any, the regression tests probably would have
smashed them out long ago. A great deal of boring, grubby, finicky
work went into getting our test framework in shape, but it has paid
off hugely.</p>
<h2>We use every fault scanner we can lay our hands on</h2>
<p>We regularly apply Coverity, <a
href='http://sourceforge.net/apps/mediawiki/cppcheck/'>cppcheck</a>,
and <a
href="http://clang-analyzer.llvm.org/scan-build.html">scan-build</a>.
We've as yet been unable to eliminate all scan-build warnings, but we
require the code to audit clean under all the other tools on each
release.</p>
<p>We used to use <a href='http://www.splint.org'>splint</a>, until
we found that we couldn't reliably replicate the results of splint
runs across different Linux distributions. It was also far and away
the biggest pain in the ass to use: you have to drop cryptic,
cluttery magic comments all over your source to pass hints to splint
and suppress its extremely voluminous and picky output. We have
retired it in favor of more modern analyzers.</p>
<p>cppcheck is much newer and much less prone to false
positives. Likewise scan-build. But here's what experience tells us:
each of these tools finds overlapping but different sets of
bugs. Coverity is, by reputation at least, capable enough that it
might dominate one or more of them - but why take chances? Best to use
them all and constrain the population of undiscovered bugs to as
small a fraction of the state space as we can.</p>
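<p>A contrived example of what the scanners buy us (illustrative code,
not from gpsd): the defect below sits on an error path that canned
regression logs never take, so no amount of replay testing finds it,
but a path-sensitive scanner such as scan-build is designed to flag
exactly this kind of possible NULL dereference.</p>
<pre>
#include &lt;stdio.h>
#include &lt;string.h>

/* Returns the first line of a file, or NULL if it can't be opened. */
static const char *first_line(const char *path)
{
    static char buf[128];
    FILE *fp = fopen(path, "r");

    if (fp == NULL)
        return NULL;            /* rarely-taken path: file is missing */
    if (fgets(buf, sizeof(buf), fp) == NULL)
        buf[0] = '\0';
    fclose(fp);
    return buf;
}

int main(int argc, char *argv[])
{
    const char *line = first_line(argc > 1 ? argv[1] : "example.conf");

    /* BUG: line is NULL when the file doesn't exist.  Test runs always
     * feed a file that exists, so they never see it; a static analyzer
     * reports the possible NULL argument to strlen(). */
    printf("%zu bytes in the first line\n", strlen(line));
    return 0;
}
</pre>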
<p>We also use <a href='http://valgrind.org/'>valgrind</a> to check
for memory leaks, though this is not expected to turn up bugs (and
doesn't) due to our no-dynamic-allocation <a href="hacking.html#malloc">house
rule</a>.</p>
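<p>For readers unfamiliar with that rule, here is a rough sketch of
the style it implies (invented sizes and field names, not gpsd's
actual structures): all working storage lives in fixed-size arrays
inside statically allocated structures, so nothing is ever malloc()ed
or freed and valgrind has nothing to report.</p>
<pre>
#include &lt;stdio.h>
#include &lt;string.h>

/* Illustrative only: made-up capacities and fields. */
#define MAX_DEVICES 4
#define PACKET_MAX  1024

struct device_ctx {
    int in_use;                    /* slot allocation flag */
    char packet[PACKET_MAX];       /* raw bytes from the receiver */
    size_t packet_len;
};

/* All per-device state lives in one static pool; "allocation" just
 * marks a slot, so the heap never grows and there is no leak to hunt. */
static struct device_ctx devices[MAX_DEVICES];

static struct device_ctx *claim_slot(void)
{
    int i;

    for (i = 0; i &lt; MAX_DEVICES; i++) {
        if (!devices[i].in_use) {
            devices[i].in_use = 1;
            return &amp;devices[i];
        }
    }
    return NULL;    /* hard limit reached: refuse rather than malloc() */
}

int main(void)
{
    struct device_ctx *dev = claim_slot();

    if (dev != NULL) {
        dev->packet_len = strlen("$GPGLL,...");
        memcpy(dev->packet, "$GPGLL,...", dev->packet_len + 1);
        printf("claimed a slot, %zu bytes buffered\n", dev->packet_len);
    }
    return 0;
}
</pre>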
<h2>We are methodical and merciless</h2>
<p>Neither magic nor genius is required to get defect densities as
low as GPSD's. It's more a matter of sheer bloody-minded persistence -
the willingness to do the up-front work required to apply and
discipline fault scanners, write test harnesses, and automate your
verification process so you can run a truly rigorous validation with
the push of a button.</p>
<p>Many more projects could do this than do. And many more projects
should.</p>
<hr>
<script src="datestamp.js"></script>
</div>
</body>
</html>