The fact that Twitter, Twitter Lists and Google Wave exist warms my heart. They are tools that generate their own innovation buzz eco-system and drive what this blog is all about: Future Business. Foundational tools like these, along with open source projects, are the essence of the web2.0 innovation renaissance. Think about how fast tools and processes can iterate today to match both wide-scale and niche user needs compared to where we were 10 years ago.
At the moment it is the wild west for these innovation eco-systems. Everyone thinks they have a good idea and they are running full-speed either with a little bit of money or completely bootstrapped. Over time, we will start to settle on some valuable use cases and the real money will head in that direction.
As an innovator interested in new ways that business can operate, both tools’ potential fascinates me. While Twitter Lists is pretty much what I expected it would be, Wave did not live up to my initial expectations. I’ll give both a fair shake over a period of time because, like Twitter itself, there is likely a path of use evolution. The truly valuable use cases might not show themselves until third-party apps have been built that run on top.
For Twitter Lists I am starting to see
- The lists you are included in act as a crowd-sourced social descriptor of what you tweet about
- Curating a popular list gives you credibility as a networker in the space that list covers
For Wave I think we are going to need tools and agreed conventions which
- Help us collectively “garden” (manage) waves. Waves have structure and are objects intended to grow over time. Because they become more complex over time, they need constant management so that they remain accessible to newcomers and previous visitors/contributors alike.
- Help us find portions of waves that are relevant to our needs and re-use those elements in our own content spaces: other waves, blogs, etc…
Long live the companies that are thinking about how to start the next innovation eco-system.
With the economic downturn, businesses will be looking for ways to control costs. Will that mean more centralization? How will more centralization conflict with the web2.0 trend towards lighter, more de-centralized apps? Will the lighter and more niche software/hosted products find that funds from business units are drying up?
Popular Knowledge Management expert David Gurteen spoke about this tension even before the financial crisis on his blog. Larry Hawes has a recent blog post discussing Autonomy in Collaboration.
I have had several discussions in the last few days with a VA-based firm currently struggling with the issue of centralization. Historically, their business has been run as autonomous business units free to operate in the way that best meets their customers’ needs.
While I am sure this firm has always struggled between centralization and de-centralization, the web has exacerbated the problem and web2.0 has taken it to yet another level. With the advent of the web, each unit began developing its own channel to interact with its customers. A few years ago, when the company wanted to upgrade to a more web2.0 approach that allowed their customers to contribute more online, they sought to build a common platform.
However, the autonomous business units lived on. Because they are quite independent, they are constantly seeking to diverge in order to meet the specific needs of their customers. At the same time IT continues to work towards increased centralization. As you can imagine, this is creating some tension.
A service oriented architecture (SOA) with shared web services and appropriate SOA governance might be their salvation. If IT can control the main architecture and help facilitate the sharing of approved web services, this firm may be able to get the centralization they need while allowing for business units to meet their own customer needs.
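To make the SOA idea concrete, here is a minimal sketch of the pattern: central IT owns a governed service contract and its shared implementation, while each business unit adapts the results to its own customers. All class and method names here are hypothetical illustrations, not anything this firm actually built.

```python
from abc import ABC, abstractmethod

# Centrally governed service contract: in an SOA, IT owns this interface
# and the shared implementation behind it. (Names are hypothetical.)
class CustomerProfileService(ABC):
    @abstractmethod
    def get_profile(self, customer_id: str) -> dict:
        ...

# Shared implementation maintained by central IT.
class SharedProfileService(CustomerProfileService):
    def __init__(self):
        self._store = {"c1": {"name": "Acme Co", "segment": "retail"}}

    def get_profile(self, customer_id: str) -> dict:
        return dict(self._store.get(customer_id, {}))

# A business unit reuses the shared service but adapts the output to its
# own customers' needs -- divergence at the edge, not in the core.
class RetailUnitPortal:
    def __init__(self, profiles: CustomerProfileService):
        self.profiles = profiles

    def render_summary(self, customer_id: str) -> str:
        p = self.profiles.get_profile(customer_id)
        return f"{p.get('name', 'unknown')} ({p.get('segment', 'n/a')})"

portal = RetailUnitPortal(SharedProfileService())
print(portal.render_summary("c1"))  # prints "Acme Co (retail)"
```

The governance lever is the contract: IT approves changes to `CustomerProfileService`, while business units are free to vary everything above it.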
Is SOA the way that this increasing tension might be relieved in many organizations? I doubt IT is going to give up working towards standardization and cost savings and I know that if business units feel their customers are not being served by what IT is providing, they are going to continue pushing for autonomy. If not SOA, how is this tension going to be resolved?
Web2.0 is all about contributions from end-users, a.k.a. the Wisdom of Crowds. Prediction Markets are a very efficient way to consolidate that wisdom, which can be found in various nooks and crannies. If this concept is new to you, read Wikipedia for a primer. If not, read on.
Prediction Markets always receive a huge surge around election time because studies have shown that, with enough activity, they are more accurate predictors than polls. This year InTrade.com went very mainstream, getting mentions in numerous news reports and link-outs from Yahoo and other mainstream online information providers.
There was a move by DARPA in 2003 to use a prediction market to predict terrorist activity. While in theory that was a good approach to gauging the threat at any given moment, someone forgot to consider that most Americans don’t understand the concept of a prediction market and so just heard, “We are going to be betting on terrorist attacks.” You can imagine that program was shut down quickly.
So, will Prediction Markets maintain their momentum post-election this time around? I certainly hope so. We have learned from the worldwide financial crisis that we need people who make decisions to have more information at their disposal rather than less. BI, Balanced Scorecards, and, yes, prediction markets should be getting attention/funding from those who feel that their organization is being led too much from “the gut”.
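For readers who want to see the consolidation mechanism itself, here is a sketch of one common market maker used in prediction markets, the logarithmic market scoring rule (LMSR). This is a generic textbook mechanism, not necessarily what InTrade used, and the liquidity parameter `b` is an arbitrary illustrative choice.

```python
import math

def lmsr_prices(shares, b=100.0):
    """Current LMSR prices (implied probabilities) for each outcome.

    shares -- quantity of shares outstanding for each outcome
    b      -- liquidity parameter (higher = prices move more slowly)
    """
    exps = [math.exp(q / b) for q in shares]
    total = sum(exps)
    return [e / total for e in exps]

# Two outcomes, e.g. "candidate A wins" vs "candidate B wins".
shares = [0.0, 0.0]
print(lmsr_prices(shares))   # no trades yet: even money, [0.5, 0.5]

# Traders buy 50 shares of outcome A; the market price rises toward
# the crowd's belief that A is more likely.
shares[0] += 50.0
print(round(lmsr_prices(shares)[0], 2))  # prints 0.62 -- A is the favorite
```

The key property is that the price of each outcome is always between 0 and 1 and all prices sum to 1, so the market's current prices can be read directly as the crowd's consolidated probability estimate.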
There is no doubt that Education is critical to the future of the United States economy. Thomas Friedman talks about it extensively in his very popular book, The World is Flat. Therefore, the concepts in this post on Future Education are not only directly applicable to business, but our success in Future Education will have a direct impact on our abilities in Future Business.
I spent 90 minutes last night on the phone with an excellent visionary from the NYC Dept of Education. Arthur VanderVeen is focused on how best to achieve knowledge sharing for NYC educators.
We talked about the difficulty of turning tacit knowledge into explicit and we talked about the challenges of fostering active communities of interest/practice.
Two of the main tenets of our discussion were
- Give them what they want: the sharing needs to have value for the way they work today or want to work today. Some technologies (e.g., the computer, the cell phone) completely change the way we work, but most enhance the way we already work, in an evolutionary rather than a revolutionary fashion
- Work bottom-up rather than top-down. Try various programs in schools and see what works. Where there is success, invest more to work out whether it can be scaled up.
One thing that has come to mind since our discussion is the 100-10-1 rule of community involvement. In the case of education it is probably 1000-100-10-1 due to the challenge of getting already overworked educators to even view information.
- For every 1000 educators
- 100 will actively or passively browse the knowledge-base
- 10 will comment on or use existing content
- and 1 will contribute something new
That means that for 80,000 teachers you may only have 80 contributors. This is likely not sufficient volume to create a critical mass of content that keeps the 100 coming back and gets more of the 1000 to view. The larger districts may decide to invest in “librarians” who seek out good content and take the time to get it into the knowledge-base, but this is not the most efficient model and is probably not tenable for the smaller districts.