Monday, March 31, 2008

Research labs and innovation priorities in an IT organization

Earlier this month HP announced that HP Labs is going to focus on 20-30 large projects going forward instead of a large number of small projects. If you compare these with the top 10 strategic priorities for 2008 that Gartner announced late last year, you will find a lot of similarities, even though HP's projects are not necessarily designed to address only short-term priorities. A quick comparison:

HP focus area : Gartner 2008 priority
  • Sustainability: Green IT
  • Dynamic Cloud Services: Web Platform & SOA + Real World Web
  • Information explosion: Metadata management + Business Process Modeling
“The steps we’re taking today will further strengthen Labs and help ensure that HP is focused on groundbreaking research that addresses customer needs and creates new growth opportunities for the company.”

The role of a traditional "lab" in an IT organization has changed over the last few years to focus on growth and value projects that align with the company's operational, strategic, management, and product innovation priorities. Researchers have been under pressure to contribute significantly to efforts that are directly linked to the product lines. There are pros and cons to being research-oriented versus product-oriented, and it is critical that researchers balance their efforts. I firmly believe that labs should be very much an integral part of an organization and that anything they do should have a direct connection to the organization.

“To deliver these new, rich experiences, the technology industry must address significant challenges in every area of IT – from devices to networks to content distribution. HP Labs is now aligned to sharpen its focus on solving these complex problems so HP and its customers can capitalize on this shift.”
Traditionally, labs have been perceived as a cool place to work where you can do whatever you want without any accountability toward the company's strategy, and this poses serious credibility issues for some labs regarding their ability to contribute to the bottom line. I agree that a research organization should be shielded from the rest of the organization, or incubated to a certain extent, to protect the ongoing business from disruption and to allow researchers to focus and flourish. Eventually, though, the efforts should be integrated well into the organization, with stakeholders having enough skin in the game to adopt, productize, and possibly commercialize what comes out of the lab. The credibility of a lab in an organization goes a long way, since the product development organizations largely control what customers will actually use, at least in IT organizations. Many innovations that come out of a lab may never see the light of day if the research organization lacks the credibility to deliver what customers actually want. Innovation by itself is not very useful until it is contextualized with customers' and users' needs to solve specific problems.

Tuesday, March 25, 2008

Alaska Airlines expedites the check-in process through design-led innovation

Southwest Airlines is known to have cracked the problem of how to board an aircraft effectively, and Disney specializes in managing crowds and long lines. Add one more to this list: Alaska Airlines. Fast Company is running a story on how Alaska Airlines has been redesigning the check-in area to reduce the average check-in time at the Anchorage airport. This is a textbook example of design-led innovation and has all the design thinking and user-centered design elements: need finding, ethnography, brainstorming, rapid prototyping, and end-user validation. Alaska Airlines visited a variety of places to learn how others manage crowds and applied those learnings to their own problem, supported by contextual inquiry with the check-in agents. They built low-fidelity prototypes and refined them based on early validation.

The story also mentions that Delta is trying a similar approach at its Atlanta terminal, where passengers can see where they're going. The mental rehearsal, or mental imagery, aspects of cognitive psychology have been successfully applied to improve athletic performance, and there have been some experiments outside sports; this is a very good example. Imagine an airport layout where the security check is visible from the check-in line. This could make people mentally rehearse the security check while they wait for their boarding passes, so that they are more likely to complete the actual security check much faster.

What makes this story even more compelling is that they managed to satisfy customers by reducing the average wait time while also cutting costs, proving that saving money and improving customer experience are not mutually exclusive. The innovation does not have to be complicated. They also took a holistic approach to experience design, where a customer's experience starts on the web and ends at the airport. Some people suggest airplane-shaped boarding areas to expedite boarding. This is an intriguing thought, and it is exactly the kind of thinking we need to break out of the traditional mindset and apply the design-thinking approach to champion a solution. I am all in for innovations that speed up check-in and boarding, as long as I don't have to wear one of those bracelets that could give people debilitating shocks!

Wednesday, March 19, 2008

User-generated content, incentives, reputation, and factual accuracy

Not all user-generated content is factually accurate, and it does not have to be that way. I don't expect Wikipedia to be completely accurate, yet somehow many people have a problem with that. Traditionally, knowledge bases that require high factual accuracy upfront have suffered slow growth due to a high barrier to entry. Wikipedia's predecessor, Nupedia, had a long upfront peer-review process that hindered growth and eventually led to the Wikipedia model as we know it. Google Knol is trying to solve this problem by introducing incentives to promote the quality of the thoughtocracy. I haven't seen Knol beyond this mockup, and I am a little skeptical of a system that can deliver accuracy and wild growth at the same time. I would be happy to be proven wrong here.

For an incentive-based system, it is important not to confuse the factual accuracy of content with its popularity. Popular content is not necessarily accurate. If we believe that incentives can bring accuracy, we need to be careful to tie incentives to accuracy and not to popularity. That is much harder to accomplish, since the incentive scheme then needs to rate the content and the author based on sources and upfront fact-checking, and not just on traffic, which indicates popularity. Mahalo is trying to solve the popularity problem, not the accuracy problem. There have been some attempts to try out a reputation model for Wikipedia, but the success has been somewhat underwhelming. I see many opportunities and much potential in this area, especially if you can cleverly combine reputation with accuracy.
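To make the distinction concrete, here is a minimal sketch of how an incentive scheme might score contributions on sourcing and fact-check outcomes rather than on traffic. All field names, weights, and numbers below are hypothetical, not any real system's model:

from dataclasses import dataclass

@dataclass
class Contribution:
    citations: int        # independent sources cited by the author
    verified_claims: int  # claims confirmed by reviewers' fact checks
    disputed_claims: int  # claims flagged as wrong or unsourced
    page_views: int       # raw traffic: popularity, not accuracy

def accuracy_score(c: Contribution) -> float:
    """Reward sourcing and verified fact checks; ignore traffic entirely."""
    checked = c.verified_claims + c.disputed_claims
    if checked == 0:
        return 0.0  # unreviewed content earns no accuracy credit
    verified_ratio = c.verified_claims / checked
    return verified_ratio * min(1.0, c.citations / 5.0)

def popularity_score(c: Contribution) -> int:
    """A traffic-only signal, deliberately kept separate from accuracy."""
    return c.page_views

# A viral but poorly sourced article vs. a modest, well-sourced one.
viral = Contribution(citations=0, verified_claims=1, disputed_claims=4, page_views=500_000)
sourced = Contribution(citations=8, verified_claims=9, disputed_claims=1, page_views=2_000)
print(accuracy_score(viral), accuracy_score(sourced))  # 0.0 vs 0.9

The point of keeping the two signals separate is that the viral article earns no accuracy credit however much traffic it draws, so incentives paid on the accuracy score cannot be gamed with popularity alone.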

In reality, what we need is a combination of restriction-free content creation, fact-checking, incentives, and reputation. These models are not mutually exclusive, are not necessarily required at all times, and should not be enforced for all content across all users. Guided or informative content tends to be popular irrespective of its factual accuracy, since it is positioned as a guide and not as fact. People who are in the business of working off facts, such as media reporters or students working on a thesis, should watch out for content that is useful, looks reputable, and seems current and factual but is simply wrong, and they should follow a systematic due-diligence fact-checking process.

Friday, March 14, 2008

Ray Ozzie on service strategy and economics of cloud computing

In a recent interview with Om Malik, Ray Ozzie discusses Microsoft's service strategy and the economics of cloud computing.

Desktop is not dead: He reaffirms that the desktop is not going away, but it needs to become more and more network- and web-aware to support computing and deployment needs.

"A student today or a web startup, they don’t actually start at the desktop. They start at the web, they start building web solutions, and immediately deploy that to a browser. So from that perspective, what programming models can I give these folks that they can extend that functionality out to the edge?........There are things that the web is good for, but that doesn’t necessarily mean that for all those things that the desktop is not good anymore."

Microsoft did try to ignore the Internet in the early days, and they obviously don't want to repeat that mistake. The desktop is certainly not going away, but there are plenty of innovation opportunities around the operating system. I am happy that Microsoft is considering a user-centric approach to the desktop as opposed to an application-centric one.

Economics of cloud computing: There are massive efforts already underway in this direction, and we have seen some results, such as Microsoft Live.

"I think we’re well positioned, because we have a selfish need to do these things, and because we have platform genetics. We have the capacity to invest at the levels of infrastructure that are necessary to play in this game. So I think we’ll be well positioned."

This is simply the truth that people need to recognize. Cloud computing is about scale, and scale requires large investments in infrastructure with no guaranteed near-term return. This is one of those boats that does not seem to have an obvious early ROI, but you don't want to miss it. Microsoft will certainly be well positioned on both the consumer and the supplier sides. They can run their productivity suite and other applications on the cloud and at the same time give partners and ISVs the opportunity to author and run applications on the cloud.
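A back-of-the-envelope calculation shows why the early ROI is not obvious. Every figure below is a made-up assumption for illustration, not anyone's actual cost structure:

# Hypothetical data center economics; all numbers are illustrative assumptions.
capex = 500_000_000             # upfront build-out cost, USD
servers = 50_000
amortization_years = 4
annual_opex_per_server = 600    # assumed power, cooling, and maintenance

price_per_server_hour = 0.80    # assumed revenue per physical server-hour
utilization = 0.30              # fraction of capacity actually sold early on

annual_cost = capex / amortization_years + servers * annual_opex_per_server
annual_revenue = servers * 24 * 365 * utilization * price_per_server_hour
print(f"annual cost:    ${annual_cost:,.0f}")     # $155,000,000
print(f"annual revenue: ${annual_revenue:,.0f}")  # $105,120,000

# Utilization needed just to break even at this price point:
breakeven = annual_cost / (servers * 24 * 365 * price_per_server_hour)
print(f"break-even utilization: {breakeven:.0%}")  # 44%

Under these assumed numbers, the facility loses money at 30% utilization and breaks even only around 44%; the investment pays off only as scale and demand climb, which is exactly why this boat has no obvious early ROI and yet cannot be missed.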

"But, we have every reason to believe that it will be a profitable business. It’s an inevitable business. The higher levels in the app stack require that this infrastructure exists, and the margins are probably going to be higher in the stack than they are down at the bottom."

The business value proposition of composition on the cloud, the ability to build applications on this platform, is tremendous, and that is the revenue stream Microsoft could count on. Profit expectations from the street are inevitable and there is no room for loss, but raising prices at the bottom of the stack would increase the barrier to entry, while competition at the commodity level yields thin margins and risks slow adoption. He cites Amazon's strategy of setting the price low despite having an opportunity to raise it without risking the loss of customers.

Cloud computing is not a zero-sum game, but organizations will be forced to make money somewhere to sustain the infrastructure investment and ongoing maintenance, and perhaps rake in a decent profit on top of it.

Tuesday, March 11, 2008

Bottom-up software adoption – an opportunity or a threat?

I have been observing a trend where business users or information workers become more informed and educated about the range of productivity software available in the marketplace and start using it without any help or consent from IT. Taken as a threat, IT could attempt to block these applications, and frustrated users would still find ways to work around the restrictions. Taken as an opportunity, IT takes the hint and standardizes these options across the organization to speed up adoption. The latter is a win-win situation: IT gets beta users doing acceptance testing without being asked, and it can focus on more strategic tasks, empower users, and support users' aspirations and goals by providing them with the tools they need. This trend follows the wisdom of the crowd: if software is good enough, it will bubble up. Firefox is a good example. Users started using it well before IT decided to include it on the standard machines given out to users.

I can understand why enterprise applications such as ERP and SCM are unlikely candidates for bottom-up adoption: they require heavy upfront customization, are tightly integrated with the organization’s business processes, and have complex requirements such as compliance, process integration, workflow, and access control that demand IT's involvement. This is slowly changing as SaaS becomes more popular and applications can reach users directly, overcoming adoption barriers by eliminating the upfront IT involvement. Zoho People is a good example of such an application, and Salesforce.com has achieved bottom-up departmental adoption despite IT’s traditional claim to CRM ownership. Departmental solutions do have the drawback of becoming silos, making cross-department integration difficult, which can result in bad data quality due to redundancy and a lack of effective collaboration. To overcome some of these concerns, collaboration is a key feature in any application that is a likely candidate for bottom-up adoption. Google Apps is a good example: it introduced a feature that allows users to discover each other and potentially collaborate across departments in an organization.

Decision-making is tipping toward information workers, and many business users don't necessarily see the need for some pre-installed on-premise solutions. The cultural shift toward blending personal and professional life is also making certain web-based tools their choice. If I were a vendor that finds a CIO sale a bit tricky, I would be watching this trend very closely.

Thursday, March 6, 2008

Blurring boundaries and blended competencies for retail and manufacturing supply chains

Widespread adoption of RFID in Supply Chain Management (SCM) and Supplier Relationship Management (SRM) systems diminishes the boundary between retail and manufacturing systems, and the respective competencies begin to blend as well. Today's supply chain goes beyond adding a few more warehouses or trucks. Think about the supply chain for a new Harry Potter book and you will have a completely different perspective on the timeliness and security of your supply chain orchestration.

Collaboration capabilities are key competencies, and they become even more crucial when a supply chain is disrupted by an exception. Solutions should have the capability to handle exceptions. Some of the people I speak to in this domain tell me that a system typically does a pretty good job when things are fine, but when an exception occurs, such as a supplier backing out, people scramble to handle the disruption, and a system's ability to capture unstructured collaborative workflow in the context of structured data could go a long way. People don't want, and don’t expect, the system to make decisions for them. They want systems that help and empower them to make decisions that make their supply chain leaner and smarter. They want an exception management system.
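As a rough sketch of what capturing unstructured collaboration in the context of structured data might look like, consider a data model, with all names hypothetical, where the free-form discussion thread hangs off the structured exception record instead of living in a disconnected email chain:

from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Message:
    """Unstructured collaboration: free-form notes from planners and buyers."""
    author: str
    text: str
    timestamp: datetime = field(default_factory=datetime.utcnow)

@dataclass
class SupplyChainException:
    """Structured context: the order and event the discussion is about."""
    order_id: str
    supplier_id: str
    event: str                                   # e.g. "supplier backed out"
    status: str = "open"
    thread: list[Message] = field(default_factory=list)

    def discuss(self, author: str, text: str) -> None:
        # The conversation stays attached to the structured record,
        # so decisions are made, and later audited, in context.
        self.thread.append(Message(author, text))

    def resolve(self, resolution: str) -> None:
        self.discuss("system", f"resolved: {resolution}")
        self.status = "resolved"

exc = SupplyChainException("PO-4711", "SUP-0042", "supplier backed out")
exc.discuss("planner", "Can SUP-0097 cover 60% of the volume by Friday?")
exc.resolve("split order across SUP-0097 and SUP-0114")

Because the thread is anchored to the order and supplier it concerns, the system assists the decision rather than making it, and the resolution stays auditable in context.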

It would be naïve to say that retailers don’t model capacity. For retailers it is not just about demand and supply but also about how to optimize shelf space, and that is part of the capacity-modeling equation. Companies such as DemandTec offer retail optimization solutions in this domain.
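In its simplest form, shelf-space optimization is a knapsack-style problem: each product demands a certain width of facing, each is expected to return some margin, and the shelf is finite. Here is a toy greedy sketch under that assumption; real retail optimizers such as DemandTec's model far more, including demand elasticity and cross-product effects:

def allocate_shelf(products, shelf_width):
    """Greedy knapsack approximation: rank products by expected margin
    per unit of shelf width, then fill the shelf in that order."""
    ranked = sorted(products, key=lambda p: p["margin"] / p["width"], reverse=True)
    chosen, used = [], 0.0
    for p in ranked:
        if used + p["width"] <= shelf_width:
            chosen.append(p["name"])
            used += p["width"]
    return chosen

# Hypothetical products: facing width demanded (cm) and expected weekly margin ($).
products = [
    {"name": "cereal A", "width": 40, "margin": 30},
    {"name": "cereal B", "width": 25, "margin": 24},
    {"name": "cereal C", "width": 60, "margin": 33},
    {"name": "cereal D", "width": 20, "margin": 10},
]
print(allocate_shelf(products, shelf_width=100))  # ['cereal B', 'cereal A', 'cereal D']

Note that cereal C loses out despite the highest absolute margin, because it returns the least per centimeter of shelf; that trade-off is the heart of the capacity-modeling equation.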

I see the supply chain as a capability, not a solution, and a well-designed supply chain could help companies achieve their just-in-time inventory mantra.

Tuesday, March 4, 2008

Open source licenses and their impact on commercialization

The choice of an open source license sparks a debate from time to time, and this time around it is about using the GPL as a strategic weapon to force your competitors to share their code versus using BSD, having faith in your proprietary solution as an open source derivative, to reduce the barrier to entry into the market. I acknowledge the success of MySQL, but I won’t attribute its entire success to the chosen license. Comparing open source licenses in the context of commercializing a database is a very narrow comparison. First, PostgreSQL and MySQL are not identical databases and don’t have exactly the same customers; second, I see a database as an enabler for value added on top of it. EnterpriseDB is a great example of this value-add, and I think it is very speculative to say whether it is an acquisition target or not. The real question is whether EnterpriseDB would have accomplished the same if PostgreSQL had used the GPL instead of BSD.

I see plenty of opportunity for innovation in open source licensing, and over time disruptive business models will force the licenses to align with what businesses really need. The IP indemnification in GPL v3 is a classic example of how licenses evolve based on the commercial dynamics among organizations. We can expect licenses to become even more complex with the wide adoption of SaaS delivery models, where a vendor no longer ships any software.

People do believe in open source, but they may not necessarily accept that they have a legal obligation to contribute back to the open source community every time they do something interesting with it; Richard Stallman would strongly disagree. Companies such as Black Duck have built a successful business model on the very fact that vendors don’t want to ship GPLed code. We should not fight the license; let's be creative, embrace open source, and innovate!