The True Cost of Losing Employees

There are always costs associated with employees leaving a company, and every company that is around for long enough inevitably has to deal with them. Final payouts, recruiter fees, and a loss of domain knowledge are the most obvious costs of losing employees. However, there are hidden costs related to time, morale, and opportunity.
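
To make the visible portion of the bill concrete, here is a rough back-of-envelope sketch in Python. Every figure in it is an assumption chosen purely for illustration, not data from any real company:

    # Back-of-envelope turnover cost estimate.
    # Every figure below is an illustrative assumption, not real data.
    annual_salary = 100_000                # assumed salary of the departing developer
    recruiter_fee = 0.20 * annual_salary   # assumed 20% placement fee
    final_payout = 10_000                  # assumed unused vacation, severance, etc.
    months_vacant = 3                      # assumed time to find a replacement
    months_ramp_up = 3                     # assumed time until the new hire is productive

    # Approximate lost output as salary-equivalent for the vacant and ramp-up months.
    lost_output = annual_salary / 12 * (months_vacant + months_ramp_up)

    total = recruiter_fee + final_payout + lost_output
    print(f"Visible and time costs: ${total:,.0f}")   # roughly $80,000

Even with conservative assumptions, the tally lands at a large fraction of a year's salary, and that is before morale and opportunity even enter the picture.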

Time is an underrated cost of losing an employee. It takes time to find, interview, and train a replacement. That is time that would otherwise have been spent implementing a killer new feature, paying down technical debt, or designing the next iteration of the product. Instead, the team has to spend time integrating a new individual and finding a new tempo of operations.

Team morale is one of the trickier things to gauge. It can't be quantified, yet a good leader knows when something is lacking in the team. Any time a company loses an employee, there is an impact on the remaining team members. Depending on how the team felt about the particular employee who is leaving, this impact can be relatively minor or it can be catastrophic. In the worst possible case, the departure of a key employee could trigger an avalanche of further departures. At the very least, the remaining team members may lack focus for a while as they adjust to the shock.

Opportunity is, in a way, the culmination of the previously mentioned costs, as well as perhaps the deadliest. Companies must continue to produce, time and time again, if they want to survive in an increasingly competitive world. When an employee leaves, there is a danger that the company will miss opportunities it might otherwise have had. Whether that is because 'knowledge walked out the door' or because the team is distracted and unfocused, the end result is the same: missed opportunities to become more competitive or to increase business.

Folks, the best way to avoid losing employees is to keep them happy. You can do this by treating your employees well, giving them challenging and satisfying work, and ensuring that compensation is never a problem. An excellent team is, of course, an essential part of a successful business. Lose your team, lose your business.

Pay for Your Apps, Folks, or We All Suffer the Consequences

Gentlemen!, available on the iTunes App Store and Google Play, is the latest example of how tough the app development business can be. The app has received some very good reviews for its unique style and gameplay. According to Killian Bell at Cult of Android, the developers of Gentlemen! have noted that the game "has over 6,000 players on Android". Sounds great, right? 6,000 players is a nice number for the early days of an app. The problem is that of all the people who have played the game, only 50 paid for it.

Let that sink in for a while. Fewer than 1% of the people who downloaded and played the game were paying customers. To put it another way, over 99% of the game's players were freeloaders. 
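
For concreteness, here is the arithmetic on the two figures reported above (nothing assumed beyond the numbers from the Cult of Android piece):

    players = 6000   # reported Android player count
    paying = 50      # reported paying customers

    conversion = paying / players
    print(f"{conversion:.2%} paid")              # 0.83%, fewer than 1 in 100
    print(f"{1 - conversion:.2%} did not pay")   # 99.17%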

As I've noted before, creating an app isn't necessarily the path to riches. However, this is ridiculous. The game is priced at roughly $3 in both stores, which isn't a large amount of money by any measure. There really isn't a good reason for this game to be pirated so heavily when the price is low and the quality is high. By not paying for the game, players send the developers a message: either the game isn't worth their time to develop, or they must employ the sleazy techniques used in many freemium games.

Folks, we all want to play good games. The best way to ensure that new good games are created is to pay for them.

For Developers, There is Buy-In and Then There is "Buy-In"

There is a strange phenomenon common to software development projects that rears its ugly head time and time again in a developer's career. It is a phenomenon that at worst will destroy a project (and sometimes a company) and at best will result in disgruntled developers. This phenomenon of which I speak is called "buy-in".

To clarify, I'm not referring to the traditional sense of buy-in, where the parties involved in a decision not only agree to it but actively support it. Instead, I'm referring to "buy-in" in cases where a single party (usually management or some other decision-making body) makes a decision and then applies heavy pressure to get the affected parties to agree to and support that decision. Thus, "buy-in" can be considered a form of fake, or faux, buy-in.

This problem usually manifests itself in the form of scope creep, schedule changes, or unpopular architectural or business changes. If you've ever been asked to 'take on one more thing for this sprint' or been pressured into fitting your development estimates into a particular prescribed timeline, then there is a very good chance that you've experienced a form of "buy-in". It seems as though the primary reason management strives for this faux buy-in is either some lame attempt at 'rallying the troops for a common cause' or an attempt to make it more difficult to pin blame on any specific person (after all, 'the team bought into the decision').

In any case, this sort of thing is not healthy for a development team. True buy-in is healthy since the team is involved and committed to something that they believe can be successful. "Buy-in" is unhealthy since it is something that the team does not believe can be successful or can only be successful via death march conditions. An unmotivated and overworked team is not a recipe for success.

Folks, be cognizant of when you are being asked for your buy-in to a decision versus being asked for "buy-in". 

'Designing for a Touch Screen'

The Penny Arcade - Extra Credits series of videos on game design and game culture topics is generally quite excellent. This particular video covers some of the challenges and follies of designing games for a touch screen.

Perhaps the most interesting statement in the video is the idea that a 'virtual joystick' on a touch device is an example of poor design. I have to say that, in general, I agree. I most often see this type of design in games that try to emulate the traditional platformer style (e.g. 'Mario' style games). This type of game doesn't really lend itself to a touch device, since a virtual joystick is far more error-prone than a platformer can tolerate. Typically, the best way to handle a platformer on a touch device is to create an 'endless runner' where the controls are simplified to a single touch (such as in Rock Runners) or at most two touch spots (such as in Punch Quest). Perhaps the only good example of a virtual joystick is in Zombiegal Kawaii, and I think that only works because it is a shooting game (instead of a platformer), where the lack of precise controls isn't quite as apparent.
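
To illustrate the difference, here is a minimal sketch in Python. All names and dimensions are hypothetical (none of this comes from the games mentioned above); it simply shows why a whole-screen tap is so much more forgiving than a virtual joystick:

    # Minimal sketch (hypothetical names and dimensions) contrasting two
    # touch control schemes.
    SCREEN_H = 640

    def joystick_control(x, y):
        """Virtual joystick: the thumb must land inside a small on-screen pad.
        A miss or drift outside the pad produces no input at all."""
        pad_cx, pad_cy, pad_r = 100, SCREEN_H - 100, 60
        if (x - pad_cx) ** 2 + (y - pad_cy) ** 2 > pad_r ** 2:
            return None                         # touch missed the pad; input lost
        return ("move", x - pad_cx, y - pad_cy)

    def single_touch_control(x, y):
        """Endless-runner scheme: a tap anywhere on the screen is a jump,
        so the coordinates can never 'miss'."""
        return ("jump",)

    # A slightly-off thumb placement: the joystick drops the input, while the
    # single-touch scheme still registers it.
    print(joystick_control(180, SCREEN_H - 100))       # None
    print(single_touch_control(180, SCREEN_H - 100))   # ('jump',)

The joystick scheme has a failure mode (missing the pad) that the single-touch scheme simply cannot have, which is the crux of the design argument.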

Folks, watch the video in the source link.

The Problem with Collecting Statistics to Measure Developer Productivity

One thing I've seen often in my time as a developer is the attempt to measure developer productivity. While I can understand the desire to gain insight into how a project is coming along, with very few exceptions it turns into an exercise in futility. The problem with most methods used to quantify developer productivity is that they are akin to measuring a portrait artist's productivity by the number of brushstrokes completed.

One example I have seen is measuring productivity by the number of lines of source code, typically counted in the thousands (KLOC). The basic idea is that you take a comparable software project (with a known number of lines of source code) and compare it with where the current project stands, thus establishing an estimate for how much effort remains. There are several problems with this approach. The first (and most obvious) is that it assumes two software projects are truly directly comparable. Unless the same 'cookie cutter' projects are being produced repeatedly, it is quite rare for two different software projects to be close enough in complexity, staffing experience/talent, and budget/schedule for lines of code to be a useful measure. As well, measuring lines of code doesn't account for differences in developers' abilities (e.g. two developers may write the same feature using very different numbers of lines of code), nor does it account for differences in how many lines of code different programming languages require.
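
That last point is easy to demonstrate with a contrived sketch: two implementations of the same trivial feature, identical in behavior, yet a LOC-based measure would rank their authors very differently:

    # Two implementations of the same feature: summing the even numbers in a list.
    # Identical behavior, very different line counts; LOC says nothing about
    # which author was more "productive."

    def sum_evens_verbose(numbers):
        total = 0
        for n in numbers:
            if n % 2 == 0:
                total = total + n
        return total

    def sum_evens_terse(numbers):
        return sum(n for n in numbers if n % 2 == 0)

    assert sum_evens_verbose([1, 2, 3, 4]) == sum_evens_terse([1, 2, 3, 4]) == 6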

Another example I have seen is counting the number of code checkins to the source code repository. This is a particularly baffling statistic to measure, since it cannot be correlated with developer productivity. Even after accounting for checkin standards, there is far too much variability in how software is developed for checkin counts to be meaningful. One feature may require significant changes to a single file, while another may require trivial changes to many files. As well, there is going to be variability in how developers check in their code, even with standards established. Some developers will use as few checkins as possible (collecting changes into larger chunks), while others will produce many more checkins (each with a much smaller set of changes).
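
A made-up example (every number here is invented for illustration) shows how badly checkin counts can mislead:

    # Illustrative (made-up) checkin histories for the same completed feature.
    # Developer A checks in once; Developer B checks in small increments.
    dev_a_checkins = [{"files": 1, "lines_changed": 400}]        # one big checkin
    dev_b_checkins = [{"files": 2, "lines_changed": 50}] * 8     # eight small ones

    for name, checkins in [("A", dev_a_checkins), ("B", dev_b_checkins)]:
        print(f"Dev {name}: {len(checkins)} checkins, "
              f"{sum(c['lines_changed'] for c in checkins)} lines changed")

    # Dev A: 1 checkins, 400 lines changed
    # Dev B: 8 checkins, 400 lines changed
    # By checkin count, B looks eight times as productive for identical output.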

These are just a few of the kinds of measurement follies that exist out in the wild. If these types of measurements (and others like them) are problematic, then what sort of measurements (if any) are useful? 

Quite frankly, the best measure of developer productivity is how useful the software is to its users and how much enjoyment they get from using it. Everything else isn't really that important. Certainly, there are considerations of timeliness and staying within budget, but the questions around those topics aren't going to be answered by measuring developer productivity as if software were produced on an assembly line. What works for measuring productivity on the factory floor does not work in the context of a development team.

Folks, don't let your managers try to measure your productivity using methods best used in other professions.

Don't Be a Jerk at the Office, Folks

If you've worked at an office for any length of time, particularly a software development shop, the odds are very good that you've run into them. You know who I'm talking about. There are many different names for them: curmudgeons, know-it-alls, divas, and so on. No matter the particular term used, they share similar qualities, including (but not limited to) a generally surly demeanor, an inability to empathize, and a tendency to freak out when things don't go their way. I'm talking, of course, about jerks.

Strangely, the software development profession tends to attract a disproportionate number of jerks. There are many reasons for this (which could span a whole post). In any case, this type of individual can cause teams, departments, and sometimes entire companies to be dysfunctional. How do they do this?

The first way is through their negative attitude. Jerks can cause the mood of a room to shift merely by their presence. Their surly demeanor can easily wipe the smile off someone's face. In some instances, other employees go into 'defense' mode immediately because they have become accustomed to fending off the jerk's communication style.

Another way that these individuals can cause problems is through their inability to empathize with others. This can manifest itself as a tendency to 'throw people under the bus'. Rather than trying to understand where others in the organization are coming from and the pressures that they are dealing with, jerks are often focused on only the task at hand in their particular area of responsibility. When things go awry (as happens in many development projects), these individuals are quick to cast blame on others and rarely (if ever) accept responsibility for their mistakes (especially publicly). This leads to a culture of distrust in a company.

The final major way that jerks cause trouble for a development shop is by freaking out when things don't go their way. This manifests itself in various ways. The most obvious is unnecessarily turning the smallest disagreements into 'earth-shattering' events. A more subtle form is when the individual decides to 'go rogue' and make decisions without consulting the appropriate parties. These reactions not only cause unnecessary confusion and concern in others, but also mean that the loudest mouth in the room (or the most stubborn one) has undue sway over technical decisions or project matters. Coworkers simply won't have the energy to deal with the jerk and will let things slip past them that would have been dealt with in a healthy environment.

Folks, nobody wants to deal with jerks. They are a big drag on workplace harmony and job satisfaction. Don't be a jerk, and don't let jerks take over your office.


Conferences are Important, Folks

Previously, I wrote about the importance of developer perks, and conferences are an important perk. To be sure, there are always costs associated with attending a conference. Often this involves a financial cost (e.g. ticket price, airfare, lodging), but there are other costs as well, such as the opportunity cost of time away from work. Whether you attend a large conference such as Apple's WWDC (Worldwide Developers Conference) or a more intimate one such as Cocoaconf, the benefits of attending a conference can far outweigh the costs involved.

The most obvious benefit is the ability to enhance critical skills. Many conferences understandably focus on this aspect as a major selling point, and some increase the skill enhancement by offering pre-conference workshops. However, the advent of live session streaming (or at least the quick posting of session videos to the conference website) has diminished this part of the value proposition. That's not to say that skill enhancement is an unimportant part of the conference experience, just that it isn't the most important.

Why, then, is it important to attend conferences? Two reasons, really: networking and exposure to new ideas. 

Networking is an often undervalued aspect of conference attendance. It is, however, something that cannot be done while watching session videos at home. Meeting people at the conference can lead to new business connections, new business or job opportunities, and new friends. In addition, the social interaction before sessions, between sessions, at lunch time, and after the sessions end for the day is when much of the learning takes place. While you are in a session, your mind is still absorbing the content as it is presented. The non-session time provides an opportunity for you to collect your thoughts and to share those thoughts with other attendees. This will often be a good way to 'break the ice' and to clarify things that may not have been immediately clear during the session.

Exposure to new ideas is quite possibly the best reason to attend a conference. Sure, new ideas are hypothetically only a quick web search away. However, meeting and having conversations with new people is by far the quickest way to be exposed to ideas that would not have otherwise crossed your mind. You could learn about a new technology or technique that will solve a long-standing problem you've been having, or you could be inspired to move in a whole new business direction.

In a way, conferences can be considered great examples of concentrated serendipity. Where else can you enhance your skills, make new contacts, and discover new ideas in only a few days? Just make sure you can get a ticket, folks.

Your List of Features Should Be Like the Menu From a Fine Restaurant

Odds are good that as a software developer you will either be tempted to create software that does anything-anyone-could-ever-want, or you will work on a project where the client/customer/product owner asks for features that lead the software down this path. You start out making a software product that is simple and comprehensible, but by the time all is said and done you've created one that is complex and incomprehensible. It only takes a few rounds of feature creep to turn your simple 'checklist' app into a full-blown 'business process management suite', folks.

There have been many examples of failed software projects where a major share of the blame can be traced to a massive feature list. Why does this happen? There are many causes, but one major reason is that software, by its very nature, is malleable. Unlike with building a bridge (for example), it isn't necessarily obvious when a feature request is absurd. Therefore, there is a higher likelihood that unnecessary or problematic features will be added to a software project. It would be incredibly unlikely for a bridge-building project to change scope from handling pedestrian traffic to handling fully loaded semi-trucks after the first bit of concrete has been poured, yet in the software world it is very common for small, single-user applications to turn into multi-user behemoths.

Have you ever paid attention to the menu at a fine restaurant? Unlike the bloated menu board at a fast food restaurant or the multi-page menu at a franchise restaurant, the odds are very good that the menu at a fine restaurant is limited to a single page and no more than 10 to 15 items. Why is this? For one thing, a fine restaurant prides itself on providing an excellent dining experience. A key part of that experience is that the food must not merely be competent, but great. Fast food and franchise restaurants try to be all things to all people, and are not very good at many of the things they produce. The fine restaurant, on the other hand, focuses on perfecting a very small set of menu items. Software is very much the same in this respect. Think about your favorite apps: do they try to do many things, or do they focus on a single use case?

Folks, when you add every feature under the sun to your software, you're pretty much creating a disaster like the car that Homer Simpson designed. However, when your software's list of features is like the menu from a fine restaurant, life is delicious.

Developer Perks are Important, Folks

Eric Spiegel in an article on Datamation.com:

Now it was Frank’s turn to roll his eyes. “Whatever, Shaun. Sipping soda helps keep me in rhythm while I code. It’s hard to explain–it’s like a part of my creative process. Security guards and nurses don’t need to be concerned about their creative juices.” I interjected, “I don’t know about that, Frank, but I will tell you that this new policy likely is just the beginning of changes we won’t like. It’s a sign that things are changing–and not for the better. This isn’t a startup anymore. I’m sure the latest investors are trying to squeeze out as much profit as possible so we can go public or sell the company. These changes are clear signs that the culture of the company is changing right before our eyes.”

In his post, Spiegel covers his experience at a company that had fairly standard snacks & beverages perks at the beginning but over time cut the perks and related 'non-essentials' such as training and conferences. Predictably, reduced pay raises and layoffs followed.

Perks such as snacks & beverages are not necessarily essential to a nice work environment, but they can act as a weak indicator of a company's health and of management's attention to making developers comfortable. The addition of new perks signals that management maintains a keen interest in creating a comfortable work environment and that the business is healthy enough to financially support such a move. In contrast, the removal of existing perks signals that management no longer cares about its employees or that the business can no longer financially support the perks.

Remember, folks, perks don't actually have to cost the company any money. While snacks & beverages are relatively cheap compared to the return on investment in terms of developer goodwill, they do in fact cost the business money. What doesn't cost much (if anything) is letting developers work from home. People appreciate the flexibility that working from home offers, and developers (as part of the creative class) often need 'heads-down' time that is relatively free from distractions so that they can be productive. (Of course, make sure that you don't botch the work from home program.)

Perks are important for retaining top talent. Making your company attractive to developers is almost as important as making your company attractive to your customers. After all, without talented developers, how will your company deliver on its promises to your customers?

You Must Know What Your Project is All About

Brian Welcker, writing on his own blog, Direct Reports, had a good post about knowing when a project is headed for trouble. The lens for his post is his experience on Microsoft's file system project, WinFS. The best part:

I suspected from early on that the project was doomed to failure. What made me think this? Because when I would ask anyone involved with the project the question "What is it?", I would get a very different answer.

If the people involved in a project can't state its goal relatively consistently and relatively easily, then you're probably in trouble. This indicates either a project that is too big/unfocused/gnarly to be completed successfully, or one too ill-defined for everyone involved to deliver effectively on its goal. I've been involved in both types of projects, and trust me, it isn't pretty.