Friday, November 21, 2008

Technical Excellence

In many professional businesses, technical excellence is taken for granted -- we assume that having it is "table stakes" for competing.

However, it is not a trivial question to ask whether and how a firm ensures that its employees actually meet high standards of technical expertise, especially in a world where companies tend to signal that revenue generation is a more pressing (if not more important) concern.

Who is best positioned in a professional organisation to judge an employee's technical quality? The obvious answer is that person's supervisor, but there are built-in conflicts: what if the supervisor is under economic pressure to meet group goals and is therefore tempted to compromise (a little) on technical excellence?

I'm curious about your experience as to how your firm or company goes about ensuring technical excellence. Is it some combination of:

  1. Training
  2. On-the-job supervision
  3. Peer review processes at the job level
  4. Annual performance appraisals
  5. Reward schemes

Or something else?

6 comments:

Anonymous said...

As both a buyer and seller of technical competence, I find that people and firms consistently overstate their competence.

We have a self-assessment system for all recruits, and one area is their technical competence.

We're continually amazed at the people who rate their skills as world class.

The same is true of service firms I've hired.

Client ratings are helpful, but we also find that some clients rate people highly by their own standards only because the client is mediocre or lacks exposure to best practices.

Feedback from a best-of-breed client is far more useful than an average across a bell curve of clients.

Anonymous said...

Technical excellence is taken as a given.

Maybe.

I agree with Jeff above that in my experience humans (I'm not sure about dogs and cats) have a tendency to over-rate their own skills.

This ties in with an empirical finding from behavioural finance: people tend to over-estimate their own skill.

The real danger here is that the "over-estimation" increases as the level of expertise increases!

The frightening conclusion is that simply attaining more knowledge and skill isn't enough to counter the problem.

As a quality-control mechanism, we use a combination of what we call "technical review" (a low-level, detailed review of calculations and models) and "peer review" (a high-level review of methodology, areas covered, risks considered, scenarios used, etc.).

This works well as a quality control mechanism, but also provides a way for both the reviewer and the reviewee to learn more about the particular area.

The danger here, as Richard is suggesting, is this: what happens when there is more billable work available than hands, eyes and feet?

We all know that the first thing that goes out the window is the relationship-building, research-generating, business-building (and fun!) stuff.

Next (from my experience) is the robustness of the review.

We sometimes kid ourselves that we can send out a draft report then review it for major problems.

Anybody who has ever sent out a draft report then tried to change anything material in it realises that this just isn't a good plan.

And I completely agree that it shouldn't be done.

It shows lack of respect for your client and does enormous damage to your credibility and trust in the relationship.

Once the numbers are out, the pressure on the reviewer is not to find errors, since it will be so awkward if they do.

Not an ideal scenario.

We've mostly learnt to place peer review right at the top of priorities.

I expect Richard may not agree with this approach (and I have my own reservations), but making the peer review process part of performance appraisals shows our whole team that we are serious about review, and team members are rated poorly if they don't follow the right processes at the right time.

I'd be interested to hear how other firms have ensured that technical excellence can be safely assumed.

Anonymous said...

For evaluating technical competence, I like a combination of ratings by supervisor(s), peers, and "customers" (those who receive service).

For ensuring it, training is important, but having a supervisor who's strong on development can make the difference between mid-range performance and top performance.

Anonymous said...

Richard: thanks for your writing, which has been inspirational for myself and my colleagues over many years.

We are especially fans of "True Professionalism" and "The Trusted Advisor", although for practical guidance you can't beat "Managing the Professional Service Firm".

At Fitzgerald Analytics, we screen for a set of core technical skills which are then augmented via both formal and on-the-job training.

Ultimately, however, the most crucial factor in ensuring quality is through rigorous Quality Assurance standards and processes.

In addition to increasing quality of technical service to clients, we find that QA provides a built-in mechanism and forum for coaching, learning, and continuous improvement of consultant skills.

Thanks again for a thought-provoking post.

Jaime Fitzgerald

Anonymous said...

At Obtiva, we have different approaches for our different practice areas.

In our ISV practice (what we call the Studio), it is almost exclusively based on mentoring, on-the-job learning and peer review.

For our consulting business we have to rely heavily on screening, references and ultimately, customer feedback.

In addition, I think that the most important factor by far is culture.

To achieve and maintain technical excellence, it has to be part of the fundamental make-up of the firm.

Gareth
Lead Consultant, Obtiva

Anonymous said...

Richard --

I've worked for a number of years in both technical (software) and strategic consulting roles.

I'd echo Gareth's comments that the most important factor in developing talent is culture -- people need to see an expectation of excellence all around them.

Hand-in-hand with that is mentorship.

I've seen consultants develop the fastest when they form a strong mentor relationship with senior leaders of their firm.

Prior to developing talent, though, comes making the right hiring decisions.

I've recently been experimenting with an "apprentice" approach to hiring.

This involves hiring someone on a contract basis for a defined time period -- say, 1 month.

This gives time to see how they actually work on real problems with my real team.

Putting people into a prolonged, realistic work setting gives you a chance to see how they truly perform, and eliminates many hiring mistakes based on people who "interview well".

This scheme is quite common in Europe, given the permanence of hiring decisions there.

Jason Whaley
Owner, Whaley Consulting Services