How do you recognize a bullsh*t strategy?

One, they are expressed as goals, without saying anything about how to reach those goals.

Two, they are generic and shared by pretty much all the other brands and companies in your category.

Three, they are fluffy and written in such a loose and broad way that no obvious actions fall out of them. What does "leverage synergies" mean? What do you do with that?

A strategy is the unique value a business provides to the market.

A unique value is the benefit your customers get from your product, which they can't get anywhere else, and which a hell of a lot of people want or need.

The intellectual content of a strategy - the thinking behind it - is only half the battle. The other half is converting that thinking into a strategy that is actually usable.

So what can you do?

You can put your strategy through the subjectivity test: remove all subjective language, anything like 'good', 'great', 'world-class', 'best' and 'smart', and see if there is any substance left.
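The subjectivity test can be sketched as a tiny text filter. This is only an illustration; the word list and the example statement below are made up, not from the book:

```python
# Illustrative sketch of the "subjectivity test": strip subjective words
# from a strategy statement and see what substance remains.
# The word list and example statement are hypothetical.

SUBJECTIVE_WORDS = {"good", "great", "world-class", "best", "smart",
                    "leading", "innovative", "excellent"}

def subjectivity_test(statement: str) -> str:
    """Return the statement with subjective words removed."""
    kept = [w for w in statement.split()
            if w.lower().strip(",.") not in SUBJECTIVE_WORDS]
    return " ".join(kept)

print(subjectivity_test("Deliver world-class, innovative products with smart service"))
```

If what comes out is little more than "deliver products with service", the statement had no substance to begin with.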

You could also play the opposite game, where you ask yourself if the opposite of your strategy also makes logical sense. If the answer is yes, then you probably have a good strategy on your hands, because it represents a true strategic choice.

PowerPoint or Word?

Most strategies float around in "The Deck". A nice long PowerPoint presentation with a few pillars, onions, missions, visions, and the like. A PowerPoint lets you get away with all the things that wouldn't fly in a conversation or email.

Instead, just write it the way you'd tell it. 

A single page of A4 with a few paragraphs of argument and explanation, culminating in the punchline ("therefore we are going to do X"). Your job is simply to explain it so that anyone who reads it gets it.

There should be no difference between your written explanation and your spoken one.

Even a super-crisp strategy is still, ultimately, going to be fairly abstract, so it's important you really land the idea (and get the ball rolling) by listing some key actions arising from it.
  • What must you do to deliver on this?
  • What needs to change?
  • What do you need to stop doing?
  • What needs to be added?

If a strategy doesn't prompt ideas automatically then it has a problem - probably one of being too abstract, and not practically grounded enough.

"No Bullsh*t strategy" by Alex M H Smith

Strategy, strategy and strategy

... or shall we call it an action agenda? 

I loved the conversation between Professor Richard Rumelt and Lenny Rachitsky at Lenny's Newsletter.

Goals, ambitions, visions, missions, values, wished-for end states - none of these things are a strategy.

And it's not true that these things have to be in place before you can have a strategy. Strategies are fundamentally about what you’ll do in response to a challenge. Strategy is problem solving, and you cannot solve a problem you don't understand. 

As understanding deepens, the strategist seeks the crux - the one challenge that both is critical and appears to be solvable.

What makes up a good strategy?

A diagnosis of the situation. Figure out what's going on here and understand the challenge you face. The challenge can be to deal with change and competition, it can be triggered by a large opportunity, or it can be internal, like outdated routines, bureaucracy, or a lack of collaboration.

A guiding policy, i.e. what you will do and what you will not do about the challenge. It is "guiding" because it channels actions in certain directions without defining exactly what shall be done.

A set of coherent actions that will carry out the guiding policy. This part is so easy to leave out because people like to think of strategy as a high level conceptual thing. Strategy is about action. There must be enough clarity about action to bring concepts down to earth.

When deciding what you will do with the challenge, find your source of advantage. Do you know something that others don't? Do you have a skill that others don't have? Do you have a reputation, brand or existing market system that others cannot replicate? Do you have scale, technology, experience or other resources that others don't have?

A bad strategy is fluff and fails to face the challenge; it lacks a diagnosis. If you don't frame the challenge, it is difficult to assess the quality of the strategy.

Another mistake is to treat goals as a strategy. Many bad strategies are just statements of desire rather than plans for overcoming obstacles. Good strategic objectives are the outcome of a strategy, not its input.

Bad strategy is the active avoidance of the hard work of crafting a good strategy. One common reason for choosing avoidance is the pain or difficulty of choice.

Good strategy requires leaders who are willing and able to say no to a wide variety of actions and interests.

Strategy is not mysterious. It is about solving the most important problem you are facing. You need to be focused on something doable and be consistent about it.

Wild west to Agile

History's role is not to help us predict the future, but to prepare us for it. 

Today we talk about product management, agile, waterfall, projects, iterative and measures of success. I enjoy putting this in a historical context. Because those who fail to learn from history are condemned to repeat it. History can help us prepare for the future. Jim Highsmith's career over the last six decades is a good start.

Wild West (1966-1979)

Software engineering was in its infancy, and knowledge of how to do software development was negligible. Software project success was measured by completion and cost. Businesses operated on the premise that the world was predictable and that if plans failed to materialize, the problem was execution, not planning. Change was the "unusual state".

Structured Methods and Monumental Methodologies (1980 - 1989)

This was an era where management theory was firmly stuck in the command-control model. IT projects were generally viewed as out of control. Managers assumed building software was similar to building a warehouse. They often knew little about the value of IT systems, even though productivity increased, and therefore focused on "out of control" cost and schedule overruns.

Structured methods were supposed to bring discipline and control to software development.

But the attempt to go from the undisciplined Wild West to a more disciplined, formal approach overshot and focused on the wrong things - processes and documents. 

Formality was mistaken for discipline. Formal processes, phase reviews and documentation added a level of bureaucracy that overshadowed the benefits of structured techniques. It worked for tangible products and construction-type projects though.

The belief was that documentation was enough to convey all the information required by individuals in the next phase of work.

A paper by Winston Royce in 1970 is often credited with introducing the Waterfall model of software development (a term he never used in the paper). However, Royce did not advocate for the model in its strict, linear form. He highlighted its limitations and risks, particularly the challenge of accommodating changes once the project is underway. To address these shortcomings he argued for feedback loops, overlapping development phases, and the need for more flexibility and adaptability.

But the waterfall lifecycle was greatly influenced by the surrounding ethos, managers' thinking, and the serial "hardware" mindset of the times. Departments outside of IT also adapted their work to conform to the serial mindset. For example, accounting started to classify operating (OpEx) versus capital (CapEx) costs.

Too much structure reduced problem solving and innovation; too little created chaos and ineffectiveness.

The Roots of Agile (1990 - 2000)

Executives now cried for help. "It takes too long, costs too much, and doesn't meet our needs".

In response, software developers began experimenting with RAD (rapid application development) methodologies. They assembled a small team, delivered working software every week, and conducted "customer focus groups" every Friday to get feedback. This was about value, quality, speed, leadership, and collaboration.

The 1990s were the decade where software outsourcing to solve technical debt and the high-cost mess began to flourish. The assumption was that less expensive programming could be accomplished with minimal communication. But much of the outsourcing was based on the false premises of predictability and prescriptive practices.

Worse, it stripped internal IT teams of the expertise they would need in the coming Internet upheaval.

The methodology of software development continued to be deterministic and formal in the 1990s, but slivers of RAD and iterative development began to creep in, especially for rapidly expanding Internet applications.

The 1990s were the decade of process for IT. We sought not just perfection in the way we built software, but total predictability. It wasn't enough just to do things right; we also had to say in advance exactly what we intended to do, and then do exactly that.

The Agile Era (2001 - 2010)

Demand exploded. Acquiring and retaining technical talent proved difficult. Many companies didn't have the money or the talent pool to respond to the pressures of the 2000s. IT organizations faced pressure from the high costs. Companies had to innovate and expand their technical and product design capabilities.

Technical debt, which had been growing rapidly, was still almost invisible to business executives. IT struggled to convince business leaders to approve the investments required to reduce technical debt, thereby enabling continuous value delivery.

On February 11-13, 2001, 17 software developers met at the Snowbird ski resort in Utah to talk, ski, relax, and try to find common ground leading to the Agile manifesto.

Many software engineers were concerned about the nascent agile movement. They viewed agile development as a retreat to the ad hoc practices of the past. In reality, teams were highly disciplined but worked somewhat informally. The emphasis shifted from documentation towards collaboration, replacing traditional formality.

From 2001 to 2004, individual teams received dispensation to try this "agile stuff". Teams focused on iterations, stories, daily stand-ups, backlogs and co-located teams - but the core technical practices were often bypassed.

Agile projects' success was often downplayed by others: "it was just a small project", "it was a new project", "they didn't have to follow our standards". A general comment from Agilists was "We don't need any project management". But it was the management of tasks rather than people that they were complaining about.

From 2005 to 2010, success versus failure with organizational implementation of agile development rested with courageous executives who understood agility at their core. Courageous executives thrived on adventure, the corporate kind, and had the ability to sort through a myriad of opportunities, engage others with their enthusiasm, and demonstrate results through action.

Another success factor was that Agilists had enough background in change models to muddle through team-level implementations. At an enterprise level, it made a difference when they received help from organizational change experts.

Do we implement from top to bottom, or from bottom to top? Do we start with a few teams and use their experience to seed others, or do we "sheep dip" everyone? What was the strategy for moving agility up (the organizational hierarchy) and sideways? How do we instill both being agile and doing agile? Whose change model and approach do we use? Agilists were simply not experts in change management.

Other success factors were demonstrated business benefits, practices that appealed to engineers and a manifesto that stated a clear purpose and principles for the movement.

Digital Transformation (2011 - present)

The problem now wasn't scaling agile, it was scaling agility and innovation at enterprise levels.

Remember the lessons from agile projects: To change behavior and culture, you must change measures of success. Many managers wanted agility, but with traditional performance measures (planned scope, schedule and costs).

Over the last six decades, measures of software development success have evolved from completion to customer value. Nothing drives change like the way organizations measure success, but few things are harder to change than those measures.

The Digital Transformation period explored what becoming a digital enterprise meant. Enterprise executives developed digital strategies but found their transition from strategy to operating models lacking.

Enterprises today should be seeking to become digital transforming rather than to achieve digital transformation. Transformation indicates completion of a stage, whereas transforming suggests a constant state of becoming.

More than six decades of change have required organizations to constantly adapt by addressing the following areas:

  • Measures of success - these can drive or restrict change.
  • A better business and technology partnership - being digital demands it.
  • An operating model linking strategy to action.
  • A quickly malleable organizational model.
  • Management suited for the digital era.

Preparing for the future

Over the last 60 years, has anyone noticed the rate of change in the world, in technology, and in business slowing down? Change has spiraled constantly upward, not just linearly, but bordering on exponentially.

One conclusion from history is that agility, not agile, should be the goal. Agility is a mindset; agile is a type of methodology. Failure to make this differentiation has left organizations with prescriptive methodologies, while leaders who embrace agility have the best chance of thriving in the future.

Change takes more than a one-day workshop. No one changes deeply held mental models easily. Even when organizations and managers intend to start the journey with the right mindset, many don't realize the magnitude and complexity of this transformation.

The root cause of success is people and their interactions, and the root cause of failure is also people and their interactions.

The product operating model

Many have witnessed failed transformations, but few have witnessed true successes. 

Moving to the product operating model is a transformation. At the end, it's about consistently creating technology-powered solutions that your customers love, yet work for your business. It's about delivering real results. But what do you change exactly?

You change how you build. Products are managed as an ongoing effort - improving every week until a decision is made to sunset them. You do frequent, small releases so you know that it's working and how it's being used.

You change how you solve problems. Empowered product teams are given problems to solve and outcomes to achieve instead of a list of prioritized features, projects and perceived solutions decided by various stakeholders.

You change how you decide which problems to solve. A strong product company has a compelling product vision and insight-based product strategy identifying the most critical problems that need to be solved to deliver on the business objectives. Your strategy cannot be to serve as many business stakeholders as possible.

Pushing the decisions and responsibilities for finding the best solution to the problem down to the relevant product team, and then holding that team accountable for the results, also drives the need for new product core competencies (which normally take years to learn).

Unless you are willing to establish these new competencies, your transformation hopes will likely end here.

For the product team, together with the product manager, to discover and deliver effective solutions, it is absolutely critical that they have direct access to:

  1. Users and customers,
  2. Product data, 
  3. Business stakeholders and
  4. Engineers (the tech lead as a minimum).

Fight any attempt to place a well-meaning person or cumbersome process between the product manager and these constituencies.

An honest and accurate assessment of the organization’s current situation is essential to any plan to successfully transform. Be realistic, talk to all levels and look for evidence. 

Start with pilot teams volunteering to be on the leading edge of these changes. Develop the skills of the product teams (bottom up) and the skills of the product leaders (top down). Ensure your CEO supports and is a champion of the change.

Transformation is a long game, and for that reason it helps to have some quick wins. It could simply be a team that has never visited users starting to do so and sharing its experiences and insights.

Constantly beat the drum, evangelize and show everyone the progress being made.

"Transformed - Moving to the product operating model" by Marty Cagan.

Discover the right products

How do you know that you are making a product or service that your customers want?

It’s not only about delivering things right, but also about discovering the right things to deliver. You can't have one without the other.

Discovery is continuous: at a minimum, weekly touchpoints with customers by the team building the product, where they conduct small research activities in pursuit of a desired outcome.

Customers don't always know what they want, and what customers ask for isn't always what they need. Don't ask them what you should build. 

Ask them to share specific stories about their experiences. Avoid direct and factual questions because we struggle to answer them accurately.

The purpose is to discover and explore opportunities, i.e., what needs, pain points and desires matter most to this customer? 

The Opportunity Solution Tree (OST) is a framework for continuous discovery and a simple way to visually represent the paths we may take to reach a desired business outcome.

The opportunity space represents customer needs, pain points and desires that, if addressed, will drive the business outcome.

The solution space represents solutions addressing the opportunities, and rather than testing solutions we test the assumptions that need to be true for our solution to succeed.

A visualization and a tree structure help build a shared understanding, help you break large opportunities into a series of smaller ones, let you avoid "whether or not" decisions, make it easier to summarize your work to stakeholders, and make it easier to prioritize.

Product strategy happens in the opportunity space. Prioritize opportunities, not solutions.
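The shape of an Opportunity Solution Tree can be sketched as a simple nested data structure: an outcome at the root, opportunities beneath it, and solutions with their assumption tests below those. The labels here are hypothetical examples, not from the book:

```python
# A minimal sketch of an Opportunity Solution Tree as a tree data
# structure. Outcome, opportunity and solution labels are made-up
# illustrations.
from dataclasses import dataclass, field

@dataclass
class Node:
    kind: str        # "outcome", "opportunity", "solution", or "assumption test"
    label: str
    children: list["Node"] = field(default_factory=list)

    def add(self, kind: str, label: str) -> "Node":
        child = Node(kind, label)
        self.children.append(child)
        return child

    def show(self, depth: int = 0) -> None:
        print("  " * depth + f"[{self.kind}] {self.label}")
        for child in self.children:
            child.show(depth + 1)

# Desired business outcome at the root ...
tree = Node("outcome", "Increase weekly active users")
# ... opportunities (needs, pain points, desires) below it ...
opp = tree.add("opportunity", "Users forget to come back")
# ... and candidate solutions, each carrying the assumptions to test.
sol = opp.add("solution", "Weekly digest email")
sol.add("assumption test", "Users will open a weekly email")
tree.show()
```

The value of the structure is that prioritization discussions happen one level at a time: first which opportunities to pursue, only then which solutions to compare.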

To test assumptions, you first need to generate them. You can imagine that the solution already exists and then map out each step users must take to get value from it. This forces you to be specific, and it forces you to surface desirability, viability, feasibility and usability assumptions.

You can't test every assumption. You need to prioritize, and to prioritize you need to identify the riskiest ones. How much do we know about this assumption, and how important is it to the success of the solution?
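The two questions above can be turned into a tiny scoring sketch: the riskiest assumptions are the ones that matter most and that you know least about. The assumptions and scores below are invented purely for illustration:

```python
# Hedged sketch of prioritizing assumptions by risk, where
# risk = high importance combined with little evidence.
# All assumption texts and scores are made up.

assumptions = [
    # (assumption, importance 1-5, evidence 1-5)
    ("Users want a weekly summary", 5, 1),
    ("Users check email daily", 3, 4),
    ("We can render the summary quickly", 2, 5),
]

def risk(importance: int, evidence: int) -> int:
    # Riskiest = very important, with the least evidence behind it.
    return importance * (6 - evidence)

riskiest_first = sorted(assumptions, key=lambda a: risk(a[1], a[2]), reverse=True)
for text, imp, ev in riskiest_first:
    print(f"risk={risk(imp, ev):2d}  {text}")
```

The exact formula is unimportant; the point is to rank assumptions on both axes and spend your testing effort at the top of the list.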

When testing an assumption, be specific with your evaluation criteria upfront. 

The team must align around what success looks like. Don't throw spaghetti at the wall, hoping something sticks. Remember, you are not trying to prove that the assumption is true. You are simply trying to mitigate risk, and you stop when you have mitigated enough.

"Continuous Discovery Habits" by Teresa Torres