Performance optimization

By measuring a solution’s performance we learn how to optimize it. I am a strong proponent of tying every effort to a success metric of one sort or another, as doing so makes outcomes more purposeful and, consequently, more successful.

The future growth and viability of a solution depend directly on how flexible and scalable it is. Recognizing that there is no ultimate solution, and allowing for iterative performance optimization through testing data-driven assumptions and implementing feedback, we evolve the user experience and, with it, the solution’s potential.

With extensive experience in A/B testing, user research, and collecting and consolidating feedback into meaningful business actions and strategies, my performance optimization service offers a way to innovate and break through your local maximum.

High impact, low investment (and risk)

For some odd reason, increasing the volume of traffic seems to be the predominant way to address revenue growth. I don’t get it. Why spend heavily on paid search networks, or on the countless hours that SEO, social, and content strategy development require, in a hopeful attempt to force additional value, instead of focusing on improving the experience of existing users and allowing for organic growth?

When developing hypotheses, I immediately look for friction to eliminate, or opportunities to seize, in order to deliver the maximum positive impact on your objectives.

Iterative performance optimization strategy

Cradle to Cradle® optimization strategy

Turning friction into traction

Identifying a performance concern, such as a drop-off point in a funnel or a high abandonment rate (caused by distraction, poor usability, lack of commitment, perceived inconvenience, and so on), is met with an in-depth analysis of the individual interactions users make with the solution, pinpointing where and why the problem occurs.
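To make that analysis concrete, here is a minimal sketch of a funnel drop-off report; the step names and session data are hypothetical, and real event logs would stand in for the sample lists.

```python
# A minimal sketch of a funnel drop-off analysis, assuming per-session
# event logs; all step names and data below are hypothetical.
from collections import Counter

# Hypothetical funnel steps, in order.
FUNNEL = ["landing", "product_view", "add_to_cart", "checkout", "purchase"]

# Hypothetical sessions: the steps each user reached.
sessions = [
    ["landing", "product_view", "add_to_cart"],
    ["landing", "product_view", "add_to_cart", "checkout", "purchase"],
    ["landing"],
    ["landing", "product_view"],
    ["landing", "product_view", "add_to_cart", "checkout"],
]

# Count how many sessions reached each step.
reached = Counter(step for session in sessions for step in set(session))

# Report step-to-step conversion so the sharpest drop-off stands out.
for prev, step in zip(FUNNEL, FUNNEL[1:]):
    rate = reached[step] / reached[prev] if reached[prev] else 0.0
    print(f"{prev:>12} -> {step:<12} {rate:6.1%}  ({reached[prev]} -> {reached[step]})")
```

Running the sketch prints the conversion between each pair of adjacent steps, which is usually enough to point the in-depth interaction analysis at the right place.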

Appraising the effort required to implement potential solutions against the impact they may produce tells us whether each is worth pursuing. Once a test has been deemed viable, the resulting causal assumption takes the generalized form:

Doing A will change the experience from B to C, resulting in the solution interacting with the user in such a way as to achieve X rather than Y.

Here, objective X is targeted through an iterative optimization cycle to raise the performance of a given indicator.
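One common way to formalize the effort-versus-impact appraisal is an ICE-style score (impact, confidence, ease); the sketch below is illustrative only, and every hypothesis and score in it is hypothetical.

```python
# A minimal ICE-style prioritization sketch; every hypothesis and score
# below is hypothetical, rated on a 1-10 scale.
hypotheses = [
    # (description,                          impact, confidence, ease)
    ("Shorten checkout form to 3 fields",        8,          6,    7),
    ("Add trust badges near payment button",     4,          7,    9),
    ("Rebuild recommendation engine",            9,          4,    2),
]

# ICE score: the product of the three dimensions. Low ease (high effort)
# drags down otherwise high-impact ideas, which is exactly the appraisal
# described above.
ranked = sorted(hypotheses, key=lambda h: h[1] * h[2] * h[3], reverse=True)

for desc, impact, confidence, ease in ranked:
    print(f"ICE {impact * confidence * ease:4d}  {desc}")
```

The point is simply that viability becomes a ranking rather than a feeling: the highest-scoring assumptions enter the optimization cycle first.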

Good, but it could be better

Aspects of the solution that are performing at satisfactory levels must be given due weight in testing prioritization. In some instances it is worth arguing that amplifying successes is more viable than addressing failures, since obsolete features can be eliminated at any time.

Streamlining experience engagements and the interactions they contain, by asking how to increase the value provided or reduce the effort required to complete them, creates an inductive proposal:

Targeting A by enhancing or prioritizing the existing experience with B, the solution will provide for C so as to improve A.

A is then put in a loop where further variation and segmentation of desired impacts provides for…

…incrementally increasing returns

All it takes is one “a-ha” moment to recognize that an increase in the value of your offering may be achieved by eliminating redundancies. I am not advocating “doing more with less”; it would be more accurate to say do less with less. Who wants to do more? What a silly idea.

Note that the implementation of exclusions (ha ha) is best executed on an incremental scale, so that confidence builds as each removal is validated.

A/B testing incentives

Segmenting traffic allocation by incentive responsiveness
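One common way to implement that kind of incremental, segmented allocation is deterministic hash-based bucketing, sketched below; the experiment name and rollout percentage are hypothetical, and the scheme is an illustration rather than a prescription.

```python
# A sketch of deterministic, incremental traffic allocation for an A/B
# test; hash-based bucketing is one common approach, and the experiment
# name and rollout percentage here are purely illustrative.
import hashlib

EXPERIMENT = "incentive_test_v1"  # hypothetical experiment name

def bucket(user_id: str, experiment: str) -> int:
    """Map a user to a stable bucket in [0, 100)."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return int(digest, 16) % 100

def variant(user_id: str, rollout_pct: int) -> str:
    """Expose only `rollout_pct` percent of users to the new incentive.

    The same user always lands in the same bucket, so widening the
    rollout (10 -> 25 -> 50 ...) only adds users, never reshuffles them.
    """
    return "incentive_B" if bucket(user_id, EXPERIMENT) < rollout_pct else "control_A"

# Start small, then widen the exposed segment once results hold up.
for uid in ["user-17", "user-42", "user-99"]:
    print(uid, variant(uid, rollout_pct=10))
```

Because assignment is stable, the exposed segment can be widened step by step as validation accrues, which is exactly the incremental confidence-building described above.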

Iterative performance optimization

By making data-driven decisions in combination with user feedback, we positively impact both the return on investment of optimization and the quality of the user experience.
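As one concrete instance of a data-driven decision, the sketch below runs a standard two-proportion z-test on hypothetical A/B conversion counts to check whether an observed lift is large enough to trust; your own analytics tooling would normally supply these statistics.

```python
# A sketch of a two-proportion z-test on hypothetical A/B conversion
# counts; pure Python, using a normal-approximation p-value.
from math import erf, sqrt

# Hypothetical results: (conversions, visitors) per variant.
a_conv, a_n = 120, 2400   # control: 5.0%
b_conv, b_n = 156, 2400   # variant: 6.5%

p_a, p_b = a_conv / a_n, b_conv / b_n
p_pool = (a_conv + b_conv) / (a_n + b_n)
se = sqrt(p_pool * (1 - p_pool) * (1 / a_n + 1 / b_n))
z = (p_b - p_a) / se

# Two-sided p-value from the standard normal CDF.
p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

print(f"lift: {p_b - p_a:+.2%}, z = {z:.2f}, p = {p_value:.4f}")
```

Numbers like these settle whether the lift is real; whether it made the experience better is a separate question, as argued next.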

There is a false correlation between good conversion rates and good experience. Surely some connection may be drawn between the usefulness of a solution (not to be mistaken for its usability) and the extent to which people are satisfied using it, but it is not always as direct as one might imagine.

Conversion rates give us easy-to-understand, quantifiable business metrics that stakeholders can get excited about. However, they do not tell us whether users are satisfied with their experience, nor what impact a given optimization has had on the solution’s usage life cycle. Hence, optimizing for conversions is not the same as optimizing the experience.

The next level

Once a point has been reached at which the current experience has peaked in performance, further testing will yield very little impact, if any, when appraised against the required effort. The solution is as effective as it is ever going to be within the current framework.

Now that all the low-hanging fruit has been picked, it is time to restructure through innovation by reappraising the experience model. The necessary changes are much larger than A/B tests, and taking risks becomes a requirement. Acquired wisdom then serves to elevate beyond the current iteration and shape the assumptions of the next.

Sustainability considerations

By first identifying the friction points that stop users from engaging with the product or service, we ensure accessibility and establish grounds for further enhancement through testing. While optimizing, we also integrate flexibility through modularity, giving the solution the ability to quickly pivot its objectives and adapt to emerging opportunities as well as a shifting user base.

Every product, service, and user base is unique and deserves a unique approach, yet most can be significantly optimized or realigned at very little cost and effort within a feasible strategy framework.

Combining iterative optimization with additional services will magnify the impact generated. Heuristic evaluations that examine the user experience, together with directly gathered feedback, help identify and most accurately define the opportunities for testing. And while optimization is, in most cases, a post-production consideration, it becomes a missed opportunity if initial assumptions and future optimization prospects are not provided for during the inception stages of user research and experience architecture.

Let's work together

If you have a project that will benefit from my experience and are impressed with what you see in my portfolio, contact me to discuss how I can add value to your team.