Peer review: The best technique embedded developers aren't using

May 01, 2012

A little help from my friends? Tool-assisted peer review reduces the painstaking, time-consuming aspects of manual peer review while providing improved defect detection and reportable results.


It’s a common scenario in the software development world: A company pins a lot of hope on a new device, hoping to ride the wave of progress in the industry by getting it to market ahead of the competition. The hardware and software teams work long and hard to make it happen. Once the product reaches consumers, though, a problem quickly arises: it doesn’t integrate well with a market-leading peripheral.

As engineers think back to where things might have broken down, they realize it couldn’t have been during the spec review phase because they routinely review requirements documents, specifications, and test cases. Therefore, the problem most likely occurred during the coding process (see Figure 1). And it probably happened because the software and hardware teams were out of sync and unaware of changes during the design and development phase because they didn’t have a simple way to stay connected.


Figure 1: Defects often arise during the coding process due to a disconnect between the software and hardware teams. (Source: The State of Software Quality in 2011 – Capers Jones, 12/6/2011)




Hardware designers and software engineers who have been working with embedded code for any length of time have probably encountered a situation like this. If not, it’s certainly an embedded developer’s worst fear: a problem with a product that has already reached the marketplace or is about to be released. It could be an integration issue, a connectivity problem, or a security glitch. Whatever the problem, the outcome is the same: time and money are lost while the code is fixed, and the company’s reputation suffers.

When so many embedded teams have experienced these problems, why are they still occurring? The reasons are simple: lack of peer and code review and lack of collaboration between software and hardware teams.

Reviewing artifacts early in the design phase is a common practice and is known to be the best way to detect problems early, yet automating this process between hardware and software engineers has largely been impossible. Although few would disagree that code reviews are always a good idea, other pressures often take precedence. Of course, there are exceptions, mainly in regulated and safety-critical products. However, according to embedded code expert Jack Ganssle, about 98 percent of embedded developers aren’t doing peer review. That’s a stunning statistic given what’s at stake.

While ignoring code review might have worked in a less complicated, less connected world, using that method today leaves companies open to security, interoperability, and connectivity issues. That can lead to expensive recalls requiring time-consuming fixes after products have reached the market.

Peer review – a process by which team members inspect design documents, artifacts, and source code – helps software and firmware developers, as well as hardware designers, find more bugs and related design errors earlier in the design and prototyping stages, improving product quality and minimizing costly rework later in the development process. Because the teams share and review both technical documents and programming code in a timely manner, they stay in sync when issues are found and changes are made.

Put simply, studies show that peer reviews work. Research by Philip Koopman, an associate professor at Carnegie Mellon University, found that peer reviews are the most cost-effective way to find bugs, with 40 to 60 percent of defects found by such reviews. Koopman also found that reviews cost only about 5 to 10 percent of the project cost.

Peer review helps ensure:

  • Higher-quality products in the short term as defects are identified
  • Higher-quality products in the long term as technical debt is better managed
  • Compliance with applicable regulations
  • Interoperability with the products, peripherals, and software the device may be used with
  • Crisper, better documented, and better organized code
  • Transfer of knowledge across the entire development team

The peer review process also saves money, as evidenced by a study authored by SmartBear Software in conjunction with Cisco Systems that compared defect counts with and without code review. In both cases, the product had 463 bugs remaining after development. Without code review, getting the bug count down to 194 cost $368,000. The code review process not only fixed more bugs, getting the bug count down to 32, but it did so for $152,000.
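The arithmetic behind those figures is worth spelling out. The sketch below uses only the numbers from the SmartBear/Cisco study cited above; the per-defect costs it derives are illustrative calculations, not figures reported by the study itself.

```python
# Derive an average cost per defect removed from the study's totals.
# Inputs (463 initial bugs, remaining counts, total costs) come from the
# study cited in the text; the per-defect figures are computed here.

def cost_per_defect(initial_bugs, remaining_bugs, total_cost):
    """Average cost of each defect removed: total cost / defects fixed."""
    defects_fixed = initial_bugs - remaining_bugs
    return total_cost / defects_fixed

# Without code review: 463 -> 194 bugs for $368,000
without_review = cost_per_defect(463, 194, 368_000)  # ~$1,368 per defect

# With code review: 463 -> 32 bugs for $152,000
with_review = cost_per_defect(463, 32, 152_000)      # ~$353 per defect
```

By this measure, the reviewed project removed each defect at roughly a quarter of the cost, while also leaving far fewer defects in the shipped product.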


Tool-assisted peer review makes sense

On the surface, manual methods of peer review seem like a good way to introduce peer review without spending the money for an automated tool. But these manual methods, while certainly better than no peer review at all, are time-consuming. Furthermore, it can be difficult to collaborate with team members in different locations or time zones. What’s more, manual methods like ad hoc meetings, water cooler discussions, sending code snippets or PDFs via e-mail, and cutting and pasting code into Word documents tend to be disorganized, and critical points can be lost in the process.

Another consideration is that manual peer review does not produce reportable data. Quantifiable results are key to building support for process improvement; without them, teams are left wondering whether the hours spent in review meetings are really worth it.

Automated peer review (also called tool-assisted peer review) solves these problems. With tool-assisted peer review, hardware designers and software engineers can participate in reviews at any time, not on a set schedule. That saves time and increases engineer productivity. Developers also can share and collaborate with team members in different locations. And because all materials are in one place, gathering the right files and design documents is never a problem. In addition, having the review materials and results managed in a reportable database, as well as providing accountability within the review process, helps adhere to multiple regulatory compliance mandates.

Tool-assisted peer review is more efficient and effective than manual peer review. It enables software developers and hardware engineers to catch defects, whether stand-alone or based on changes one team must make that affect the other, earlier in the development process, when they are easier and faster to fix. The general rule of thumb is that defects detected later in the process take longer and are more complicated and expensive to fix.

Tool-assisted peer review also provides developers with a host of standard and customizable reports on metrics like defect density, inspection rate, defect detection rate, recent and open defects, lines of code added/modified/deleted, and reviews by change list. These reports and metrics can make a significant difference in the software and hardware development process. With the right metrics, the respective teams can benchmark and improve their processes. For example, a tool can flag reviews that appear trivial or have stalled, saving the team valuable time.
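Two of the metrics named above have standard definitions: defect density is defects found per thousand lines of code (KLOC), and inspection rate is lines reviewed per hour. The sketch below shows those formulas; the sample numbers are invented for illustration and do not come from any particular tool or study.

```python
# Standard review-metric formulas; the sample inputs below are hypothetical.

def defect_density(defects_found, lines_reviewed):
    """Defects found per thousand lines of code (KLOC) reviewed."""
    return defects_found / (lines_reviewed / 1000)

def inspection_rate(lines_reviewed, review_hours):
    """Lines of code reviewed per hour of review effort."""
    return lines_reviewed / review_hours

# Example: a review of 1,500 lines over 5 hours that finds 12 defects.
density = defect_density(12, 1500)  # 8.0 defects per KLOC
rate = inspection_rate(1500, 5)     # 300 LOC per hour
```

Tracking these numbers over successive reviews is what lets a team benchmark itself: an unusually high inspection rate, for instance, can signal a rushed review rather than clean code.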

Some tools, like SmartBear’s PeerReview Complete, also allow development teams to use a variety of review formats such as Word and PDF documents, 2D drawings, schematics, VHDL code, and images; develop custom workflows; create custom reports and metrics; integrate with Eclipse and Visual Studio; create customizable fields for tracking and reporting key Capability Maturity Model Integration (CMMI) audit metrics; and implement administrative and security controls. It also integrates with a development team’s existing issue tracking, development environments, and version control tools. A schematic review is shown in Figure 2.


Figure 2: Using SmartBear PeerReview Complete, an author and reviewers discuss the final aspects of a mechanical drawing before it is used in production.




Tool-assisted peer review for all development artifacts

With these types of capabilities, tool-assisted peer review tools can serve as a comprehensive solution that works with code and all artifacts created during the development process. Requirements documents, hardware and software design documents, schematics, 2D drawings, and test specifications can all be reviewed using the same time-saving tool.

The entire review process comes together in one place, simplifying the existing document review process and extending the review process into code review. The long-established benefits of peer review for design documents can be expanded to the coding process. Code review becomes a normal part of the development cycle, and early detection of defects in code becomes as natural as early detection of defects in design specifications.

A move toward peer review is a positive one for any embedded development team. It eliminates guesswork, improves productivity, saves money, and streamlines workflows. While it generally takes time to implement even a manual review process, tool-assisted peer review provides an immediately impactful peer review process without the headaches of traditional manual approaches.

John Lockhart is the product manager of PeerReview products for SmartBear Software.
