LegalBizDev

The latest post from Jim Hassett’s blog Legal Business Development.


Tip of the month:  Involve team members in planning near the start of each large matter

On large matters, invite team members to participate in the early planning to get their buy-in on budgeted time estimates, and to assign tasks across the team in a way that maximizes efficiency by taking advantage of each individual’s personal strengths and available time.

The first Wednesday of every month is devoted to a short and simple tip like this to help lawyers increase efficiency, provide greater value to their clients and/or develop new business. For more about this tip, see our Legal Project Management Quick Reference Guide.

      


How to evaluate legal project management programs (Part 3 of 3)

When you want to evaluate an LPM program, the best source of information is the people who are in a position to make an informed judgment: the lawyers themselves. In our coaching, for example, if a lawyer gets new business that seems to be related to their LPM activities, we simply ask them: What do you think? Was LPM partly responsible for this? Whether they say yes, no, or maybe, we don’t second-guess them; we merely record the opinion and move on.

At the highest level, this means listening to the opinion of the decision makers who ultimately decide whether to invest in LPM or not. For example, as Hanson Bridgett Managing Partner Andrew Giacomini summed it up in the middle of one of our ongoing coaching programs, “I don’t have any statistics, but if I didn’t believe that LPM was producing a return, I wouldn’t keep investing in it.” After each program concludes, he continued, “You can see the energy that lawyers are putting into applying it to their practices. As they pass these tools along to others, it creates new strengths for the entire firm. And if our lawyers become the best at LPM, they will get noticed.”

In the first few years we offered LPM coaching, lawyers’ feedback on its effects was limited to internal reports and occasional public case studies. More recently, we’ve created systematic reports of results both when a program ends and 90 days later, in a format like Table 1 below. These can be circulated internally to assess what works best, to publicize successes, and to generate enthusiasm for spreading the approach.

Note that in Table 1 we recommend that the first column identify the person who made the change. However, depending on their culture, some firms may prefer to eliminate the “who” column. Since LPM takes time to pay off, the benefits column should include not just benefits experienced to date but also potential benefits for the future. The more specific these entries are, the better.

Table 1

Sample format for tracking LPM benefits

Who? | LPM behavior change | Benefits

Table 2 includes typical results from past coaching programs, with some details changed slightly to protect the confidentiality of the firms.

Table 2

Examples from past LPM coaching programs

LPM behavior change: For every matter over $50K, the lawyer shared a description of project scope and assumptions with everyone on the project team.
Benefits: Team members became more familiar with what each budget included and excluded, which improved cost predictability and client satisfaction.

LPM behavior change: Lawyers were required to use a special task code to flag any work performed that was technically beyond scope.
Benefits: Kept lawyers more aware of the scope of the agreement and enabled the relationship partner to negotiate increased scope with the client where appropriate.

LPM behavior change: A lawyer established a procedure to provide written summaries of strategic objectives to clients for their review at the beginning of every new matter. This was later adopted by his entire firm.
Benefits: Improved client satisfaction, led to more accurate budgets, and increased realization.

LPM behavior change: At the start of a large matter, one lawyer used our matter planning template to create a list of key sub-tasks and assignments, then asked team members to estimate how many hours each sub-task would take them.
Benefits: Team members completed most tasks within the time estimates they provided, which led to more accurate bids, increased realization, and new business.

LPM behavior change: A litigator explained our risk analysis template to a key client and then used it to assess the budget in an early case assessment.
Benefits: The client loved the template and used it to structure the discussion of risks vs. costs, which increased client satisfaction and cost control.

LPM behavior change: A lawyer developed a new fixed fee product for consultations in a specialized area by working with a coach to identify all required sub-tasks and the range of possible time to complete each.
Benefits: Increased new business by offering a fixed price product in a specialized area before competitors did.

LPM behavior change: One lawyer added a cover memo to monthly invoices with a bullet point summary of the progress of each matter on the invoice and the expected remaining costs.
Benefits: By explaining the rationale for each fee and what to expect, the lawyer avoided surprises and increased realization.

LPM behavior change: A litigator developed a checklist of questions to ask at the beginning of each case to better define scope and assign lawyers to cases.
Benefits: More accurate bids, better team assignments, and lower costs to clients.

LPM behavior change: A lawyer arranged for the accounting department to send automatic “tickler” emails when certain financial milestones were reached, such as when 50% of the budget had been spent.
Benefits: Improved budget tracking strengthened cost control and avoided surprises to clients by enabling early discussions of possible scope changes.

LPM behavior change: For a multi-million dollar flat fee covering a large number of litigations, the relationship partner designed spreadsheets showing cost to date and estimated cost to completion for each case, making it easy to quickly spot significant overages in attorney time above the flat fee for a given month.
Benefits: Early identification of possible problems improved discussions of why cost overruns occurred in a particular case and how to control them in the future. Ultimately, this made the fixed fee arrangement more profitable.

LPM behavior change: An IP lawyer used our matter planning template to simplify the steps required to complete patent applications for a key client, identifying 12 steps required for every application and a likely range of hours for completing each step.
Benefits: Team members were able to easily compare their effort on each phase against expectations and increase efficiency, which improved client satisfaction and increased new business.

LPM behavior change: At the end of a matter, the relationship partner conducted a short lessons learned review with the client.
Benefits: The discussion led not only to ideas for increasing efficiency but also to being assigned similar matters in the future.

LPM behavior change: A senior partner who had to approve write-downs identified a few key partners with high write-down rates and interviewed them about the causes and possible cures.
Benefits: Each lawyer developed a personal action plan to reduce write-downs, and the firm improved realization.

LPM behavior change: A practice group required team leaders to hold weekly internal team status meetings for each matter over $100K.
Benefits: Avoided duplication of effort and led to early identification of issues that could increase scope.

As firms increasingly use this type of evaluation to document the results of their LPM programs, the approaches that work best for each lawyer and each practice group will gradually spread to the entire firm. The firms that follow this path first will achieve the greatest benefit by giving clients what they asked for in the Altman Weil CLO survey quoted at the start of this series: LPM, LPM, and more LPM.

This series has been adapted from the fourth edition of the Legal Project Management Quick Reference Guide, which will be published this fall.

      


How to evaluate legal project management programs (Part 2 of 3)

Identifying the best way to evaluate a particular training or coaching program is challenging for any business. Donald Kirkpatrick “wrote the book” that most professional trainers use in this area: Evaluating Training Programs: The Four Levels. The title refers to four different ways Kirkpatrick says training can be evaluated, from the easiest approach to the most difficult:

  • Level One – Reaction – measures how students feel at the end of a program. For example: “How useful was this workshop to you?” This level requires just a questionnaire and is often the only way training programs are measured, since it’s so easy.
  • Level Two – Learning – measures how well students have mastered the course content with test questions such as “Please define LPM.” This is considered more compelling evidence of whether a training program works. However, it again requires only a written test, and mastery of the content may or may not carry over to job performance.
  • Level Three – Transfer to the job – measures how the knowledge and skills from a course or coaching program are used on the job, often several months after the program ends. For example, one could conduct interviews asking lawyers whether they had changed specific aspects of their practice after LPM coaching or training, and if so to provide specific examples.
  • Level Four – Organizational impact – measures the business impact of a program such as quality improvements. In the case of LPM, these could include such benefits as:
    • Increasing client satisfaction
    • Delivering greater value to clients
    • Protecting business with current clients
    • Increasing new business
    • Increasing the predictability of fees and costs
    • Minimizing or eliminating surprises to clients and to the firm
    • Improving communication with clients
    • Managing budget and schedule risks
    • Improving realization
    • Increasing profitability

As you ascend from Level One to Level Four, each level requires more time and money to measure. As a result, while Level One evaluations are nearly universal, Levels Three and Four are far less common among professional trainers. Unfortunately, those are also the only levels that most businesses ultimately care about. It’s nice if employees smile at the end of a workshop (Level One) or can pass a test (Level Two), but unless the program actually changes what they do on the job (Level Three) and ultimately benefits the organization (Level Four), few firms would invest in training.

Interestingly, when the LPM movement was just getting started, a report from the Association of Corporate Counsel and the ABA reinforced the idea that Levels One and Two were not enough. An article published in ACC Docket described a meeting “at which leaders of corporate and law firm litigation departments rolled up their sleeves and tackled the complex issues surrounding present day concepts of value in litigation.” After the meeting, the organizers concluded that future progress in LPM will not be based on improved understanding or increased knowledge. Instead, “The challenge is change/behavior management.” It’s not a question of knowing what to do; it’s a question of helping lawyers to do it.

When firms get serious about evaluating LPM, the first thing many think about is somehow measuring its ROI (return on investment). The underlying math is simple: ROI = (Return – Investment) / Investment. For example, if you invest $1,000 in a one-year bond that pays back $1,050, your ROI is 5% (($1,050 – $1,000) / $1,000).
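For readers who like to see the arithmetic spelled out, here is a minimal sketch of the same ROI calculation in Python; the roi function name and figures simply restate the bond example above and are not part of any LPM tool.

```python
def roi(return_amount: float, investment: float) -> float:
    """Return on investment: (Return - Investment) / Investment."""
    return (return_amount - investment) / investment

# The bond example from the paragraph above: invest $1,000, receive $1,050.
print(f"{roi(1050, 1000):.1%}")  # prints 5.0%
```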

This simplicity is appealing, but it is also deceptive when it comes to training and coaching. For one thing, according to Dominique Hanssens, UCLA Anderson Graduate School of Management’s Bud Knapp Distinguished Professor of Marketing, these are not good examples of one-time investments. (As your CFO would say, training and coaching are expenses on the firm’s profit and loss statement, not assets on the firm’s balance sheet.)

For another thing, the full impact of a training program could come months or years after its completion and is almost impossible to isolate from other factors. If a lawyer completes an LPM program, increases client satisfaction by avoiding surprises, and gets more business from that client in the future, did LPM coaching CAUSE the increased satisfaction and new business? Of course not. It was just one factor in a complex situation.

There is some evidence that investing in efficiency pays off for law firms. Altman Weil’s 2015 Law Firms in Transition survey reported that firms that said they had changed their strategic approach to efficiency were more likely to report that revenue per lawyer was up in the previous year (76% of firms that had changed vs. 62% of firms that had not) and that profits per equity partner were up (76% vs. 61%).

While this is thought provoking, scientific purists would point out that correlation does not imply causality. Perhaps it was a third factor such as effective firm leadership that produced both the efficiency and the increased revenue and profitability.

At the end of the day, it is a fool’s errand to try to scientifically PROVE that LPM – or any other single activity in a complex situation – is solely responsible for any of the possible benefits listed above.

Despite these barriers, convincing ways of evaluating LPM programs have started to appear, as explained in next week’s final post in this series.

This series has been adapted from the fourth edition of the Legal Project Management Quick Reference Guide, which will be published this fall.

 

      


How to evaluate legal project management programs (Part 1 of 3)

Clients want law firms to improve legal project management (LPM).

In its 2015 Chief Legal Officer (CLO) Survey, Altman Weil asked 258 CLOs, “Of the following service improvements and innovations, please select the three you would most like to see from your outside counsel.” The three that clients picked most often were greater cost reduction (50%), improved budget forecasting (46%), and more efficient project management (40%). Since LPM leads to improved budget forecasting and to cost reductions, you could say that the top three client requests were LPM, LPM, and more LPM. Just as they were in the previous year’s Altman Weil survey, and the one before that.

And CLOs are not impressed with what law firms have accomplished to date. In the CLO survey, Altman Weil asked respondents to rate how serious law firms are “about changing their legal service delivery model to provide greater value to clients” on a scale from 0 (not at all) to 10 (doing everything they can). The median rating was just 3.

Law firm leaders also recognize the need for change. In a separate Altman Weil survey – the 2015 Law Firms in Transition Survey – managing partners were asked which of 14 current trends were most likely to be permanent. The trend cited most often, by 93%, was an increased focus on practice efficiency, which is the goal of LPM. But when the same survey asked, “Has your firm significantly changed its strategic approach to efficiency of legal service delivery?” only 37% said yes. So 93% of firm leaders think more LPM is needed, but only 37% are doing something about it. Why?

One key reason for the slow pace of change is that if there’s one thing lawyers are good at, it’s being skeptical. That’s what makes it so important to evaluate LPM programs in this time of transition: lawyers need proof that LPM increases client satisfaction, efficiency, and new business.

To date, most “evaluations” have been intuitive judgments. And they have not always helped the cause of LPM, because there have been a number of false starts and missteps.

In 2010, Dechert announced that it had trained all its partners in LPM. Lawyers love precedent, so this led to a small fad of large-scale LPM training programs, which in turn led to some great press releases but precious little in the way of behavior change. As the chair of one AmLaw 200 firm that invested heavily in LPM training put it in our survey of Client Value and Law Firm Profitability (p. 193):

Every shareholder and top level associate [in our firm] has had a full day of project management training. I’d like to tell you that they use it, but they don’t.

To this day, training programs make it easy for firms to describe their commitment to LPM in RFP responses. But they have not convinced skeptics, nor have they persuaded many attorneys to change the way they practice law.

With the wisdom of hindsight, the fact that education alone did not change behavior should not have surprised anyone. Taking a course or reading a book about how to lead a healthier life by quitting smoking, eating more vegetables, and exercising regularly does not mean that you will actually do any of these things. Training professionals have amassed a large body of research from other professions showing that education does not necessarily change behavior.

For example, according to one medical white paper:

Traditional medical education programs are effective for the transfer of knowledge, skills, and attitudes, yet ineffective in changing physician behavior… Physicians report spending about 50 hours per year in continuing medical education (CME) activities… There is an underlying belief associated with CME activities that health care professionals will improve how they practice, which will in turn improve patient outcomes. Despite this belief, many studies have demonstrated a lack of effect from formal CME.

A white paper from the engineering profession makes a similar point and then explains what does work:

Using skills and knowledge within the work environment makes learning stick, causing behavior change… In this step it is important to experience early success. This early success depends on leadership support and coaching. The system, and often the people, resists change. Employees need someone supporting them with encouragement and coaching, and running interference as they attempt to adapt their behavior.

How did experts in other professions learn these lessons? They conducted formal evaluation programs.

The only way to find out what works is to try it and measure the results. What works with one client may fail with another. What works for you may fail for your partner. And what works this year may fail next year. The world changes, and clients and law firms must change along with it. The only thing that does not change is the need for testing. As it says in the Bible (1 Thessalonians 5:21): “Test everything, retain what is good.”

There is a large body of research about the best ways to promote behavior change in large organizations. In the book Leading Change (p. 123), John Kotter, professor emeritus at the Harvard Business School, argues that one of the most effective tactics is to create short-term wins which help create internal champions. This is also the key to behavior change in law firms. In its survey Legal Project Management: Much Promise, Many Hurdles, ALM Legal Intelligence concluded (p. 17) that, “The quicker there are demonstrable positive benefits, the faster other partners will take notice.”

Once firms overcome the initial resistance to LPM by providing examples of its success, they must continue to monitor its application and results.

What gets measured gets done. The origin of the phrase has been attributed to management gurus from Tom Peters to Peter Drucker, as well as Lord Kelvin, the 16th century astronomer Rheticus, and others. Whoever said it first, the idea is important for a very simple reason: It’s true. What gets measured does get done.

Do you know anybody who wears a Fitbit to track the number of steps they walk every day? If you don’t yet, you will soon. According to one recent estimate, Fitbit has over 19 million registered users, and its revenues for its most recent fiscal year were $1.86 billion. Not bad for a company that was founded less than a decade ago. And when you add in competitors such as Garmin, Jawbone, Misfit, and others, the total market for wearable fitness trackers is much larger.

Why has demand for these devices grown so rapidly? Because there is an important truth behind their popularity: if you count your steps each day, look at the results over time, and maybe even use Fitbit’s online features to compete with friends, you are likely to walk more and ultimately become more fit.

So if lawyers want to improve their legal project management behavior, measuring the results will help ensure that they reach their goals.

As Lou Gerstner, former chairman of IBM, put it in his book Who Says Elephants Can’t Dance? (p. 231):

Execution is all about translating strategies into action programs and measuring the results… Proper execution involves building measurable targets and holding people accountable for them.

One way or another, it always comes back to measurement. When sales training guru Tom Snyder reviewed several decades of research on sales training, he concluded that, “Only those things that are measured will get done; the engine of change is measurement.”

By evaluating LPM programs more carefully, changing tactics based on what works, creating internal champions, and continuing to measure results, firms will be in a much stronger position to prosper in challenging times by offering clients what they want and need. They will also be better able to invest their limited time and money in the programs that are most likely to produce results.

But what exactly should firms do to evaluate LPM programs? The rest of this series will summarize the state of the art.

This series has been adapted from the fourth edition of the Legal Project Management Quick Reference Guide, which will be published this fall.

      


Tip of the month:  Set a schedule for internal and external progress reports

At the beginning of a large matter, define exactly when internal team members should report progress to the responsible attorney, and also when and how the responsible attorney should report progress to the client.

The first Wednesday of every month is devoted to a short and simple tip like this to help lawyers increase efficiency, provide greater value to their clients and/or develop new business. For more about this tip, see our Legal Project Management Quick Reference Guide.

      


