By Jennifer deJong – January 19, 2011


It hasn’t exactly advanced on all fronts, but ALM is inching its way forward, albeit by a circuitous route.

SD Times asked analysts and tool makers how Application Life-cycle Management—the tools and processes designed to track software projects from conception to deployment to retirement—has evolved in the last few years and where it is headed. Three major themes emerged.

First, the central promise of ALM 2.0—better integration between point tools for requirements, coding and testing—never really materialized. Second, tool makers are overwhelmingly positioning their offerings as “Agile ALM,” but the term is so pervasive that it’s difficult to separate the “agile” marketing message from actual support for agile practices. What’s more, opinions vary on what “Agile ALM” really means. Third, technologies such as application security and virtualized testing, among others, remain largely outside the ALM process today, even though they are widely recognized to play important roles in how applications are developed and managed.

About that ALM 2.0
The plug-in ALM frameworks that were supposed to let application teams stitch together different tool makers’ offerings without having to code the connections between them never arrived, said Forrester principal analyst David West. These frameworks were a key promise of ALM 2.0, a set of Forrester Research principles that outlined ALM’s future and gained wide backing from ALM tool makers in 2008.

West noted that Eclipse projects such as ALF and Corona, aimed at tool and process integration, have failed. Unless application delivery teams are working in the Microsoft Visual Studio or Eclipse IDEs, “plug-in support is limited today,” he said.

In the ALM 2.0 era, tool makers were talking about a standard way to link together tools for requirements, coding and testing, and everyone appeared ready to take the next step, added Chris Clarke, vice president of product management for distributed software development solution provider CollabNet. “It was a heady day for ALM.”

But today there is no clear solution to tie point products together, added CollabNet senior vice president of marketing Victoria Griggs. “There is still a lot of band aids and bubble gum going on—[the process] needs to be more seamless than that.”

One emerging solution is Open Services for Lifecycle Collaboration (also known as OSLC, or Open Services). Proposed by IBM, this community effort is designed to help software delivery teams more easily use life-cycle tools in combination. West said OSLC is the only open standard effort working on this problem today. “Beyond IBM and its partners, adoption is limited,” he said.

He also noted one ALM 2.0 promise that has been realized: widespread support for multiple software change and configuration management (SCCM) tools. “Today, all vendors are offering integration with multiple SCCM repositories,” he said.

What the heck is Agile ALM? 
A few years ago, when agile teams were known for tracking progress with sticky notes, not software tools, the term “Agile ALM” didn’t exist. The words contradict each other, said Voke analyst Theresa Lanowitz.

“Agile and ALM cannot exist together,” she said, adding that “agile” is concerned only with development practices, while ALM has a broader reach, from planning to coding, testing, and deployment.

But hers is largely a contrarian view. Others see Agile ALM as a natural extension, a blending of the two ideas. Cliff Utstein, vice president of marketing at software tool provider AccuRev, acknowledged that when a team first adopts agile, there isn’t a tremendous focus on tools. “But as agile projects scale out to dozens of developers and testers working at separate locations, ALM tools become critical,” he said.

CollabNet’s Griggs took it a step further. “Agile needs ALM to meet its own principles. You can’t do [things like] continuous integration and incremental software delivery on a large scale without tools,” she said.

Charles Chu, director of product management and strategy for IBM Rational, said it’s important to step back from the hype and hysteria surrounding Agile ALM.

“If you look at what’s really happening, Agile ALM is the embodiment of the set of best practices we have learned in 50 to 60 years of application development,” he said. “It’s a natural evolution—the current thinking about how best to manage the life cycle.”

Impact on agile practices 
As that thinking takes hold, ALM tool makers are adding support for agile practices including short iterations (where the team delivers working software every one to three weeks); continuous integration (where team members integrate their work at least daily); refactoring (making code more efficient without changing its behavior); user stories (high-level definitions of requirements); and backlogs (lists of deliverables ranked according to business value).

Tools must not only support these practices, but also automatically trigger the activities related to the practice. For example, a tool should “kick off a build every time you check in code,” said AccuRev founder and CTO Damon Poole.
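To make Poole’s example concrete, the automation he describes can be pictured as a simple commit hook that notifies a build server. The sketch below is illustrative only, not any vendor’s actual integration; the build-server URL and endpoint are placeholders assumed for the example.

```python
#!/usr/bin/env python3
# .git/hooks/post-commit -- illustrative sketch only; not any vendor's actual integration.
# Notifies a hypothetical CI server that new code was checked in, so it can kick off a build.

import subprocess
import urllib.parse
import urllib.request

# Placeholder endpoint; a real team would point this at its own build server.
BUILD_TRIGGER_URL = "http://ci.example.com/job/my-app/build"


def latest_commit() -> str:
    """Return the SHA of the commit that was just made."""
    return subprocess.check_output(["git", "rev-parse", "HEAD"], text=True).strip()


def trigger_build(commit_sha: str) -> None:
    """POST the commit id to the build server so it can start a build."""
    data = urllib.parse.urlencode({"commit": commit_sha}).encode()
    with urllib.request.urlopen(BUILD_TRIGGER_URL, data=data, timeout=10) as resp:
        print(f"Build triggered for {commit_sha[:8]} (HTTP {resp.status})")


if __name__ == "__main__":
    trigger_build(latest_commit())
```

In practice, most ALM and CI tools provide this wiring out of the box; the point of the sketch is simply that the check-in itself, not a person, starts the build.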

Forrester’s West said that while ALM tool makers have begun to support agile practices, the link between the tool and the practice is sometimes “transient at best.” He did not single out specific tools or offer examples.

Tool support for agile practices makes ALM teams more efficient, insisted Poole. For instance, with traditional development efforts, team members typically merge their changes to code every six weeks. With agile, that happens automatically on a daily basis. “If you are spending your time reconciling the merge and the changes, you are not going to be successful,” said Poole.

Kevin Parker, vice president and chief evangelist of tool provider Serena Software, elaborated on that theme. Agile ALM automates ongoing release management, he said. Traditional application development projects typically deliver software every six months, but agile efforts are likely to release working software every two weeks.

“Instead of building the battleship and releasing it in six months, you build the most important features first and deliver them on an incremental basis,” he said, adding that because Agile ALM has shortened the development cycle, it has forced application teams to come up with more efficient ways to define requirements—a task they have struggled with for years. In the past, business analysts interviewed project stakeholders to find out what they needed the software to do. “What you end up with is a 400-page document of dreams and ambitions,” he said.

Agile projects require a streamlined approach, where only the current space (what the software does now) and future space (what you want it to do in the future) are defined. “That allows you to focus on getting from point A to point B,” Parker said. “It’s faster and more scientific.”

The big questions 
For all the improvements it may offer, Agile ALM also raises questions. Chief among them is what happens to traditional lines separating developers and testers, said Tieren Zhou, founder and CEO of tool provider TechExcel.

In agile, QA and development are essentially one, but in traditional development efforts they are separate, he said. “Where does the QA team fit in Agile ALM?” The answer is still unknown, but he predicted that companies will take the one-team approach, and that by next year agile development will be fully integrated with quality management.

Zhou said he believes ALM tools should reflect the methodology they are designed to carry out. “As tools mature, the methodology will be completely blended with the tool.” He also said that agile isn’t the answer for every project; waterfall development still has a role. “An update to a banking application, where the requirements are well understood and clearly specified, is a good fit for waterfall,” he said as an example.

What is clear is that the agile movement has wreaked havoc with traditional role-based ALM, where point products for requirements, coding and testing align closely with the business analyst, developer and tester roles, said Microsoft senior product manager for Visual Studio ALM Matt Nunn.

“The industry is moving away from role-based offerings, because the flow across the life cycle is more important than the individual things you do,” he said.

If agile continues to gain footing in ALM, it may not be about agile per se. “What really matters is business alignment, not agile,” said Voke’s Lanowitz. She said the word “agile” has caught on because it is a good word, but what it really means is “customer responsiveness.”

The most important aspect of application life-cycle management is “staying attuned to your line of business,” Lanowitz said. HP vice president Jonathan Rende echoed that view: “These days, the business process represented in software applications is the innovation in the business itself.”

What about application security?
Even as ALM continues to evolve, some established technologies still aren’t part of the life-cycle process. “In most organizations, security professionals still own security,” said Rende, noting that security testing takes place after software is developed and ready to be deployed. Penetration testing needs to move upstream into QA, and source code analysis should take place when developers check in code, he said.
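Rende’s suggestion that source code analysis should run when developers check in code can be pictured as a pre-commit gate. The sketch below is a rough illustration only; the scanner command is a placeholder, not Fortify’s or any other vendor’s actual command line.

```python
#!/usr/bin/env python3
# .git/hooks/pre-commit -- illustrative sketch; the "security-scan" command is a placeholder,
# not any real vendor's CLI. Scans staged source files and blocks the commit on findings.

import subprocess
import sys


def staged_source_files() -> list[str]:
    """List the source files staged for this commit."""
    out = subprocess.check_output(
        ["git", "diff", "--cached", "--name-only", "--diff-filter=ACM"], text=True
    )
    return [f for f in out.splitlines() if f.endswith((".java", ".cs", ".py"))]


def main() -> int:
    files = staged_source_files()
    if not files:
        return 0  # nothing to scan
    # Hypothetical scanner invocation; substitute the team's real static analysis tool.
    result = subprocess.run(["security-scan", "--fail-on-findings", *files])
    if result.returncode != 0:
        print("Security scan reported findings; commit blocked.", file=sys.stderr)
    return result.returncode


if __name__ == "__main__":
    sys.exit(main())
```

The same idea applies further downstream: penetration tests moved into QA would run against each candidate build rather than waiting until the application is ready to deploy.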

In 2010, HP acquired Fortify Software, which makes tools to scan source code for security flaws. In 2007, HP bought SPI Dynamics. Its penetration tools test application security by simulating malicious attacks.

IBM’s Chu said the company sees customers implementing both types of testing earlier in the life cycle, but he conceded that security has not been integrated into most organizations’ life cycles in a satisfactory way. “There is recognition that it is a desirable thing. People are moving in that direction,” he said. IBM acquired source code analysis tool maker Ounce Labs in 2009, and bought penetration tool maker Watchfire in 2007.

Is virtualized testing on the radar? 
Another technology that has yet to gain a firm footing in the ALM process is virtualized testing, which lets teams test applications without having to rely on the operations group to configure physical servers for testing.

“Virtualization has taken off in data centers, but we haven’t seen many organizations using it to test applications as part of the ALM process,” said Lanowitz. One reason adoption has been slow is that, until recently, none of the major tool makers were pushing it, she said.

Now Microsoft is leading that effort, she said. The company launched its virtualized test offering, Lab Management, which is part of Visual Studio, in August 2010.

Nunn said that virtualized testing is crucial to ALM, and not just because it eliminates the cost of maintaining a physical test infrastructure. It also makes it easier for ALM teams to share highly specific information about the test environment in which the error occurred, he said. That gets around a common problem: Testers pinpoint errors and report them to developers, but developers are often unable to replicate the error, he said.

Asked why virtualized testing has been slow to take off, Nunn said, “Operations need to be more comfortable with [providing virtual test environments]. There is a time investment in moving from one way of doing things to another.”

Microsoft expects virtualized testing to take off in the next couple of years, according to Nunn.

Not happening in ALM 
Also in ALM’s future is product portfolio management (PPM). “ALM efforts today typically focus on a single application,” said Nunn. A portfolio management approach takes all of a company’s applications into account, deciding which applications to build, which to keep and which to retire, he said.

Organizations have been slow to adopt PPM because it’s difficult to “insert life-cycle management on an existing set of applications,” he said. In addition, organizations tend not to plan end-of-life for applications, but they should.

“As apps age, they become expensive to maintain. Documentation is out of date and they become difficult to look after. And developers who wrote them may no longer work for the company,” said Nunn.

Another area that may eventually fall under ALM’s wing is the customization of line-of-business applications, such as those offered by SAP. There is no good support for that in ALM today, even though organizations invest an enormous amount of time in that activity, said Nunn.

“Will packaged software vendors implement life-cycle management in their own tools? Or will ALM efforts ultimately support these applications?” he wondered. “No one is sure which way it will go.”