Agile Retrospective “Now What” in Kanban Form

I introduced and led an agile-based approach to reviewing performance, called a Retrospective, for the Delivery Services team here at ILANTUS. I learned the process from a couple of folks at Rally Software (quick shout-out to the team there, as they recently went public: RALY on the NYSE). Most notably, I appreciate Rachel Weston Rowell for her instruction over the years. There are four steps:

1. “What” – share information about performance

2. “Gut” – group sharing of gut reactions or feelings about the “what”

3. “So What” – discuss what all the information and the gut reactions mean

4. “Now What” – talk about what you want to do about it

For us, the particulars of the “What” were:

– Customer Satisfaction: based on surveys we sent out after each project was completed

– Billable Milestone Achievement: did we meet the billable milestones we set with customers? In other words, did we produce results such that we were paid according to schedule?

– Customer Referenceability: did we create new referenceable customers to help with sales?

– New Revenue from Existing Customers: how well did we generate sales from existing customers?

– Customer Outages: did we cause any outages to customer systems, and if so, how many?

I collected all of this data and put it into a presentable format before the retrospective began. We then shared our gut reactions, which took a while since there were over 20 of us in the room. Next, we discussed what it all meant.

When we got to the “Now What”, we realized that some of the things that came up were already being worked on by committees on the team. We came up with the idea of putting all the “Now What” items on a Kanban board. For those who are not familiar, a Kanban board is a wonderfully simple way to track work. You can look here for some info: http://en.wikipedia.org/wiki/Kanban_board.

Our Kanban board had four categories: “Backlog – Not started”, “Planned”, “Working (or In Progress)”, and “Completed”. It was a great way for us to look at our action items from the retrospective, see where we were in creating the change we wanted, and see what else we needed to do.
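For anyone who wants to play with the idea in code, here is a minimal sketch of that board as a simple in-memory structure in Python. The four column names come straight from our board; the class, its methods, and the example action item are purely illustrative, not a tool we actually built.

# A tiny Kanban tracker: items live in exactly one column and move between them.
COLUMNS = ["Backlog – Not started", "Planned", "Working (or In Progress)", "Completed"]

class KanbanBoard:
    def __init__(self):
        # each column name maps to the list of action items currently in it
        self.columns = {name: [] for name in COLUMNS}

    def add_item(self, item, column="Backlog – Not started"):
        """New 'Now What' items land in the backlog unless placed elsewhere."""
        self.columns[column].append(item)

    def move_item(self, item, to_column):
        """Move an item to another column, e.g. when a committee picks it up."""
        for items in self.columns.values():
            if item in items:
                items.remove(item)
                break
        self.columns[to_column].append(item)

    def show(self):
        for name, items in self.columns.items():
            print(f"{name}: {items}")

# hypothetical action item, just to show the flow
board = KanbanBoard()
board.add_item("Improve customer survey response rate")
board.move_item("Improve customer survey response rate", "Planned")
board.show()

The physical board works exactly the same way: each item sits in one column at a time, and progress is just moving it to the right.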


Email Signatures – Always On

One thing I find irritating: I’m in an email conversation with a new connection, it’s time to talk live instead of continuing the back and forth, and I can’t find their phone number. This happens rather frequently when people choose not to include their signature when they reply to or forward an email. If I don’t have their contact info stored (and it is common that I’ll want to call someone new who is not in my contact list), then I have to search my inbox for an email they originated that contains their signature. Either that or send another email asking for the phone number and wait for their reply. I know we’re only talking about minutes of extra time. Still, why spend minutes on a task you shouldn’t need to do at all?

Every email system I’m aware of has an option to include your signature in replies. Granted, this can also get annoying when an email thread has gone back and forth ten, twenty, thirty times, and there are as many signatures at the bottom. However, I think that usually-dead weight at the end of the chain (which doesn’t get in the way) proves its worth when everyone’s contact info can be found easily within the thread.

I encourage everyone to include signatures, even on forwards and replies, to be more courteous to your new connections when it’s time to move the conversation from email to live voice.

Validated Learning in Enterprise Software Development

I’m reading a really great book, “The Lean Startup” by Eric Ries. It has many great ideas on how to build a product. It is about taking the concepts from Lean Manufacturing and applying them to software development. So the ideas are mostly geared toward software product creation, but they can be applied to most types of products. I’m a bit more than halfway through, and thus far he has only used consumer internet software products as examples. One of the interesting thought experiments it has engaged me in is trying to apply his ideas to enterprise software products – software that you sell to businesses. Symplified sells to businesses, not consumers.

For example, Ries talks about a process called validated learning, which is a method by which a company can test the validity of each new product feature it puts out. So rather than only being able to say “we just put in features A, B, and C, and we altered our marketing strategy, and sales are up 10%”, you can be more targeted and say “feature A caused registrations to increase 5% and feature B caused existing users to increase their use 10%”.

One way to do this is with a process called “split-testing”. When you put out a new feature, you only put it out to half your customer base. You then measure the difference between the customers with the feature and those without it to see if it drives the behavior you are looking for (increased use, increased new users, etc.). If the feature doesn’t have any positive impact, you remove it from the product, or at least you don’t enhance it any further.
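To make the mechanics concrete, here is a minimal sketch of a split test in Python. This is just my illustration of the technique Ries describes, not anything we have built at Symplified: the function names, the hash-based 50/50 cohort assignment, and the example metric (whether a user was active) are all assumptions made for the example.

import hashlib

def assign_cohort(user_id, experiment="feature-A"):
    """Deterministically split users 50/50 so each user always sees the same variant."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return "treatment" if int(digest, 16) % 2 == 0 else "control"

def feature_enabled(user_id):
    # only the treatment half of the customer base gets the new feature
    return assign_cohort(user_id) == "treatment"

def compare_metric(events):
    """events: a list of {'user_id': ..., 'active': True/False} records collected after launch."""
    totals = {"treatment": [0, 0], "control": [0, 0]}  # [active_users, total_users]
    for e in events:
        cohort = assign_cohort(e["user_id"])
        totals[cohort][1] += 1
        totals[cohort][0] += 1 if e["active"] else 0
    return {c: (act / n if n else 0.0) for c, (act, n) in totals.items()}

# If the treatment rate isn't meaningfully higher than the control rate,
# the feature didn't drive the behavior you were looking for.
print(compare_metric([
    {"user_id": "u1", "active": True},
    {"user_id": "u2", "active": False},
    {"user_id": "u3", "active": True},
]))

Of course, in a real product you need enough users and a long enough window for the difference between the two halves to mean anything, which is exactly why this is harder for enterprise software with a small user population.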

We don’t have a formal validated learning process at Symplified. Split tests have to be built into the product, which is a lot of effort, and you have to have a large, accessible user population (which is much more prevalent when a company is selling to consumers) to make the split tests show trending data. I am currently in the middle of gathering data for a comprehensive report on how our customers are using our product and which features they are using. That could be the start of some validated learning. Issuing a customer survey would be another way to collect data. And of course, as Ries points out, talking to your customers is a great way to achieve validated learning about your features. The aforementioned report comes from metrics that our product collects and notes from many conversations with our customers.

The thought process continues and I hope to be able to get more validated learning information into the Symplified decision process as I figure out more and better ways to implement it.

Recruiting Improvements

My post from a few weeks ago talked about the evolution of my recruiting strategy. Since then, the process has evolved further. There have been improvements in two main areas: one in candidate inflow, the other in the interview process.

To start, I worked with our consultant and created a much better (and more attractive) job description. Then I got the thumbs-up to hire technical recruiters to find candidates, which really helped.

The improved interview process is now like this:

– I review resumes and determine whether a candidate moves to a phone screen

– The phone screen consists of an hour booked with a senior engineer and me. I do the company and job overview in the first 30 minutes. If we like the candidate, the senior engineer then does a 30-minute technical evaluation. Otherwise, we end the interview after the first 30 minutes, and the candidate doesn’t know that a technical evaluation would normally have followed.

– If we still like them, we invite them in for an in-person interview

– The in-person interview has been trimmed so that it is only two hours. It starts with 30 minutes with me. Then I bring them to a conference room with engineers from the team, and I stay for the first 30 minutes. Then they have 30 minutes with just the team. They end with 30 minutes with just me.

To the interviewee, there are only two steps: a one-hour call and a two-hour in-person interview. Everyone gets a really good feel for the candidate, and the candidate gets a good feel for the company.

I really like this new process. The thing I like best is that we now have two new hires starting September 6. They’re very much needed, as Symplified is signing many more deals and there is a lot of work to do.

Developing Good Recruiting Practices

I have always thought that one of the most important aspects of my job is to hire and keep exceptional employees. I have a recruiting process that has worked well thus far, though it is a bit time-consuming, with a long lead time from the start of a job search to a hire. We recently brought in a recruiting expert who is trying to change that process.

To date, to hire someone onto my team, I would get resumes from many sources (postings, recruiters, and connections). I would then have an initial 45-minute phone call with possible candidates. A lot of our work is done with customers remotely, and I felt a phone call was a good way to vet their phone-ability. There was then a 30-minute technical screen with a technical member of my team (also over the phone). And last, a 2.5-hour on-site visit: a meeting with me, then the team, and then a wrap-up with me. At each stage, of course, candidates were filtered out of the process because they did not “pass”.

I have been trying the new expert’s process lately. In it, she does the initial phone interview, which includes some basic interview questions and the technical screen. She then sends a transcript of that to me. I then decide, using input from my team, whether or not to bring them in for an in-person interview. And that in-person interview is a one-hour meeting with the whole team. We should then make a hire / no-hire decision from there.

We have done this for about four weeks so far, and one candidate made it to the in-person interview (and we didn’t want to hire him). The recruiter’s philosophy is that to make the right impression, you want to take as little of the candidate’s time as possible. That an hour in person is all you need to know whether you want to hire the person, so why use more of your time, and more importantly theirs, to figure it out? I like the idea of having to do less, but I question whether we get a good enough feel for the candidate and whether they get a good enough feel for us.

Then I read the following posts by other experts: Ring Noshioka, Angela Baldonero. They actually agree with my philosophy – that a longer interview process is the way to vet the good candidates and to give them a better understanding of the company.

That was enough to help me resist our expert’s advice. But after talking to some of the folks on my team, I decided to condense the two phone interviews into one one-hour call, still conducted by me and a senior engineer. So, although the expert’s advice is not being followed fully, it has helped shape the process and make it more efficient.

How to say “No”

I’ve got some advice for anyone who needs to tell someone “no”. It’s simple: connect with them first. Know what’s going on for them. Know what the “no” will mean to them. This is true in all areas of life. Take a managerial situation, for example (which seems to be how I’m focusing this blog). Before saying no, bring yourself to your teammate. I don’t mean go to their desk; I mean, inside yourself, meet them where they are in the situation. Exhale and get connected to their role and their request. Know the no, don’t just say it.

This is actually true for all decisions a manager will make. Don’t issue edicts from a higher-than-thou place. Be connected to your team before you make a decision, and listen to their input. For example, when I need to dedicate a person on my team to a new project, I will usually contemplate it myself, and as long as I have enough information, I come up with who I think should do it. I then bring the project to the team and specifically do not say who I think should do it. Usually the team’s decision is the same person I thought was good for the job. If it’s not, I listen to the arguments made and sometimes change my mind. If I don’t change my mind, I exhale, make sure I’m connected to the team, and give the reasons for my decision. And while not everyone may prefer that decision, it is coming from a place of connection.

This empowers everyone and results in the best decisions for the team.