Learnings from the Advanced Download Widget

Recently, we released a new feature to the download platform that powers the Sun Download Center.  Internally, we refer to it as the Advanced Download Widget (ADW).  Essentially, it’s a Web component that can be deployed on any Sun-branded Website to deliver an integrated and streamlined download experience.  Working with Lifecycle Marketing, we have also integrated the ability to present the user with a free offer (e.g. whitepapers, training, etc.) that complements the download.  It’s completely optional, but to receive the free offer, the user will need to log in with a Sun Online Account or create a new one.  To see it in action, check out the Java ME SDK 3.0 download page.  Although the project was generally successful, there were some lessons learned (in terms of what went well and what can be improved, from my own personal perspective) that I would like to share via this blog for future reference.

When developing new Web functionality that relies heavily on Web browser technology, it’s important to understand your users.  Do they largely run on a single Web browser / platform combination (Intranet apps), or do they span the gamut of browsers and platforms?  The answer may greatly affect your project plan and testing strategy, so find out before you start your project.  Sites such as Wikipedia publish aggregated Web browser usage stats for the Internet, but it is better to take your own measurements if possible.  Our sampling of a very popular Java download yielded a slightly different distribution, with 48.6% running Internet Explorer and 40.8% running Firefox.  The data helped shape our testing strategy.
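Taking your own measurements can be as simple as tallying User-Agent strings from your Web server’s access logs.  Here is a minimal sketch of that idea; the patterns and sample data are illustrative assumptions, not the actual pipeline or figures behind our measurement:

```python
import re
from collections import Counter

# Hypothetical User-Agent patterns for the major browsers of the day.
# Order matters: Firefox and IE are checked before the broad Safari match.
BROWSER_PATTERNS = [
    ("Firefox", re.compile(r"Firefox/")),
    ("Internet Explorer", re.compile(r"MSIE \d")),
    ("Safari", re.compile(r"Safari/")),
]

def classify(user_agent):
    """Map a raw User-Agent string to a browser family name."""
    for name, pattern in BROWSER_PATTERNS:
        if pattern.search(user_agent):
            return name
    return "Other"

def browser_share(user_agents):
    """Return the percentage share of each browser family."""
    counts = Counter(classify(ua) for ua in user_agents)
    total = sum(counts.values())
    return {name: round(100.0 * n / total, 1) for name, n in counts.items()}
```

In practice you would feed this the User-Agent column extracted from the access logs of a high-traffic download page, and refine the patterns for the browsers your audience actually uses.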

Given the constant evolution of Web browsers, it’s not always practical to maintain backward compatibility with outdated Web browsers; however, forward compatibility with new releases should definitely be a priority.  Internally, you want to establish guidelines on the Web browser makes and versions, as well as the browser platforms, that you can support.  This way, you can provide Engineering and QA with clear expectations on the testing scope and staffing needs.  It’s also good to have designated personnel who keep track of the product roadmaps for key Web browsers.  During the development phase for the ADW, new versions of Firefox (3.5) and Safari (4.0) were released, but they were not on our radar.  We later uncovered some minor incompatibilities with these Web browsers during the testing phase, which prompted an additional round of testing and contributed to some avoidable schedule delays.
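One way to turn such guidelines into concrete testing scope is to keep the supported matrix in a machine-readable form and enumerate the combinations QA must cover.  The entries below are examples only; the actual supported list for the ADW was internal:

```python
from itertools import product

# Hypothetical support matrix, circa the browsers mentioned above.
SUPPORTED_BROWSERS = {
    "Internet Explorer": ["6.0", "7.0", "8.0"],
    "Firefox": ["3.0", "3.5"],
    "Safari": ["3.2", "4.0"],
}
SUPPORTED_PLATFORMS = ["Windows XP", "Windows Vista", "Mac OS X", "Solaris"]

def test_matrix():
    """Enumerate (browser, version, platform) combinations for QA scoping."""
    combos = []
    for browser, versions in SUPPORTED_BROWSERS.items():
        for version, platform in product(versions, SUPPORTED_PLATFORMS):
            combos.append((browser, version, platform))
    return combos
```

This naive cross product overstates the real scope (e.g. Safari does not run on Solaris), so QA would prune invalid pairs, but even the raw count makes the staffing conversation concrete.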

One key aspect of project management is accurate planning of the time and resources it takes to complete the project.  While the Development and QA phases are largely determined by Engineering’s estimates, the business generally drives User Acceptance Testing (UAT) and therefore estimates the resources and duration required for it.  Because the project scope varies from release to release, successful UAT planning requires a good blend of gut feel (art) and common sense (science).  By applying past experience and intimate knowledge of the system to the bug fixes and enhancements in scope, you can devise a rough approach to the UAT test plan and test cases.  From the estimated time to complete each test case and the availability of UAT testers, you can then derive the duration required to complete one round of user testing.  When testing new features that rely heavily on Web browser technology, be sure to add extra time and/or testers for targeted cross browser/platform testing.  Finally, allow time for bug fixes and at least a second round of user testing to make UAT a success.
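The sizing arithmetic above can be sketched in a few lines.  All figures here (case counts, hours, tester availability, padding factors) are made-up inputs for illustration, not the actual numbers from the ADW release:

```python
import math

def uat_duration_days(num_test_cases, hours_per_case, testers,
                      hours_per_tester_per_day,
                      cross_browser_factor=1.0, rounds=2):
    """Estimate calendar days needed for UAT.

    cross_browser_factor > 1.0 pads for targeted cross browser/platform
    passes; rounds covers the initial pass plus retesting after bug fixes.
    """
    total_hours = (num_test_cases * hours_per_case
                   * cross_browser_factor * rounds)
    daily_capacity = testers * hours_per_tester_per_day
    return math.ceil(total_hours / daily_capacity)
```

For example, 40 test cases at half an hour each, padded 1.5x for cross-browser passes and run over two rounds, gives 60 tester-hours; two testers with four UAT hours a day work through that in about eight calendar days.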

Finally, while the project life cycle ends when the release goes live, the product life cycle continues on.  By product, I’m referring to the system, or the download platform in this case.  Although it’s possible to roll out new features that meet the user’s needs on day 1, quite often the road to nirvana involves a couple of design iterations.  To avoid the design-in-a-vacuum pitfall, it’s essential that you have ways to collect feedback and gain insights into the user’s real-world interactions with your system.  In our case, we chose to conduct a usability study to gather feedback from a diverse group of users, ranging from people with no prior Sun experience to developers and Sun customers who are quite familiar with our Websites.  Although the usability study was insightful, we plan to integrate Omniture into future releases so we can measure and assess the usability of the ADW across the entire user base.  Meanwhile, we have other ways (see my blog on The Voice of the Customer) to collect and act on customer feedback as well.