Archive for the individuals Category

Ecosystems and The Journey From Technical Curiosity to Trusted Infrastructure

Written on May 20, 2014 in Blog

How do you take a great new technology and make it mainstream? I get asked this question all the time. While there are many well-known factors, one of the most important—in my opinion—is that technology’s ability to inspire an ecosystem.

The Making of an Ecosystem

Here’s an example. While it’s hard to believe now, there was a time when file servers were a technical curiosity. When they first came on the scene, they were laughed at by the “real” IT departments as toys with less horsepower than the microcontrollers running peripherals in their big iron. While little file servers didn’t solve a problem in the traditional sense, they provided agility and freedom, and enabled a new kind of IT that was closer and more responsive to the business.

But despite the fact that these little servers enabled business functions to happen more quickly than the status quo, they did not take off right away. First, existing and new vendors with domain expertise—spurred on by customer demand—needed to deliver important features like management, security, and data protection. In other words, moving file servers from technical curiosity to trusted infrastructure required a strong ecosystem of other hardware and software companies.

But it didn’t end there. That very same ecosystem grew up to become some of the largest standalone categories and companies in their own right.

Early Adopters: Inspiring an Ecosystem

Ecosystems do not appear on their own and cause the transition to happen. The central technology must be so compelling that the earliest adopters will do whatever it takes to use it. That’s when you know the technical curiosity has a shot at going mainstream. The early zealots spread the religion, building momentum while things are still immature. That momentum attracts the ecosystem that optimizes, secures, protects, and integrates the new technology with the rest of the business world—resulting in true liftoff. The latter rarely, if ever, appears without the former. Without an ecosystem, the technical curiosity can remain just that, or disappear as collateral damage in the wake of something else.

No, You Can’t Go It Alone

A startup that believes its product can become core to the business of business must ask itself, “Who benefits because I exist?” It must declare those parties important allies in the journey. These concepts are simple, yet often ignored. “To build a better mousetrap, you don’t need an ecosystem,” they’ll say. Guess what? We aren’t building mousetraps.

Something old, something new, and a little magic

Written on November 5, 2013 in Blog

Today we become a bigger part of the team at Coho Data. Since we have been part of the team from the beginning, it doesn’t feel much different. However, the great work done over the past year excited us to the point of leading the latest round of financing. That great work is the delivery of an amazing storage product with disruptive performance characteristics and a future enabled by riding the evolution of the old, leveraging the new, and catalyzing it all with the magic of software.

The combination of old, new, and software to make a great product has a pattern of success. The first PCs were just that. IBM combined a new microprocessor and bit-mapped graphics with some inexpensive 8-bit peripheral support hardware and a simple operating system, and an industry was born. Coho is combining technology for a similar effect. The old here is standard off-the-shelf servers, disk, and flash. We love that stuff because everyone knows what it is and can trust it, since it’s already part of the tried-and-true compute infrastructure we use all the time. The new is a programmable SDN switch. The software magic delivers a storage system with industry-leading performance from a single node through many nodes with linear scale. And that many is really many. What this means for customers is the ability to start small while knowing up front that they don’t need to change architectures as their needs change. Scalability goes both ways (up and down). We often forget that. Of course, we are all focused on more and faster.

Many times innovation seems like a cool trick, and the Coho architecture clearly is that. Great performance can usually be achieved by building a new system top to bottom. Coho didn’t do that. It took a few components that were designed separately and effectively created an operating system for scale-out file serving. The beauty of an operating system approach is that future performance benefits accrue to the customer as each silo of technology improves on its own. This is similar to the benefits end users get for their desktop applications as things like graphics and system I/O improve. A properly designed operating system doesn’t require massive redesign or end-user retraining as hardware innovation occurs under the covers. The user simply benefits. The user benefit is also about choice. A user can decide the level of performance needed and can pay up to higher levels as needed, or not. The Coho architecture is centered on the same design philosophy, given a set of hardware components with baseline capabilities. Of course, if you look at the founding team of the company, one would expect nothing less.

As proud investors in Coho Data, we look forward to continued innovation that will benefit businesses of all sizes along all performance and cost curves. Congratulations to the team on building a product that will change the face of storage. More to come.

Hybrid wins for all of IT

Written on October 30, 2013 in Blog

Today CSC (Computer Sciences Corporation) agreed to acquire ServiceMesh. We at Ignition invested in the company more than two years ago, before anyone was saying much about hybrid anything other than vehicles. Our thesis was that hybrid would win and that ServiceMesh was defining what hybrid IT means. As THE proud and only venture investor, we are excited to see the ServiceMesh vision (already deployed in multiple large organizations) now multiplied by the capabilities of CSC on a global basis. I thought it would be a good time to reflect.

Over the course of the past few years, a lively discussion of which is the right or winning cloud type has gone on and on. We have private, public, virtual private, hosted private, the variants of PaaS (public and private), and let’s throw in SaaS just for good measure. All of this collectively can be described as hybrid IT or, more popularly, hybrid cloud. I won’t try to link to a definition for all of this because, well, the definition is just “all of it.” So the winning model will be “all of it,” but working together in a way that at least seems coordinated and coherent.

I use analogies from the earlier days of IT to draw parallels to where things are going. I’ve started using the first PC-style file servers as the initial examples of hybrid IT. In the 1980s the PC came into being, and we all had islands of storage contained on our PCs. It was possible to store information “off machine” in the first clouds, which were merely telnet or FTP servers, or in places like AOL or CompuServe, or on a company-owned mainframe system (anyone remember Irma?). Each of these off-machine places had a unique way of getting files back and forth to our PCs, and it was way different from the file access we had locally on our PC disks. So the end user had to know specific commands and had user IDs and passwords and all that good stuff. The application software (word processing and spreadsheets) of the day didn’t bother trying to figure out how to store files in these clouds. It was the user’s job to move files around and understand the protocols (modem or 3270) and unique commands to store and retrieve things in these places. So the PC was our private place, and these off-machine places were our clouds. They were separate and apart.

The local area network, the PC-style file server, and some clever software changed all that. A software trick enabled disk space on the file server to appear to the LAN-connected PC as if it were a locally attached disk. We called this technique redirection at the time, but it really was a form of virtualization and namespace management. The magic enabled application software and end users to use storage in a hybrid way without the user having to think (too much) about where things were. This hybrid storage between PC and server gave the IT folks the ability to provide real services for their end users while getting some control in the form of data governance and protection.
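
The redirection idea can be sketched in a few lines: a mapping layer intercepts a path, routes anything under a mapped “drive” to the server, and leaves everything else local, so the application sees one namespace. This is only an illustrative sketch in Python, not how DOS-era redirectors actually worked; the paths, mount table, and data here are all invented for the example.

```python
# Illustrative namespace redirection: paths under a mapped "drive"
# are routed to a (pretend) remote store; everything else is local.
# All names and contents here are hypothetical.

LOCAL = {"C:/budget.xls": b"local bytes"}
REMOTE = {"//server/share/report.doc": b"server bytes"}

# The table the redirector maintains: drive letter -> server path.
MOUNTS = {"F:": "//server/share"}

def read_file(path: str) -> bytes:
    """Return file contents; the caller never learns where they live."""
    for drive, server_prefix in MOUNTS.items():
        if path.startswith(drive):
            # Rewrite F:/report.doc -> //server/share/report.doc
            return REMOTE[server_prefix + path[len(drive):]]
    return LOCAL[path]

# Application code uses one namespace for both local and remote:
print(read_file("C:/budget.xls"))
print(read_file("F:/report.doc"))
```

The application calls one function either way; only the mount table knows which bytes cross the LAN, which is the essence of the virtualization described above.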

Fast forward twenty years, and we can talk about all IT services in a similar way, but now the dividing line is a company’s network and the outside world of the public cloud and SaaS. It’s way more than the “where it is.” It’s clearly about how it’s delivered. It’s not enough for IT to simply allow the use of off-network resources. The combined on- and off-network resources need to be presented and accessed in the same way. The unit of access and definition is no longer something that looks like a disk drive, but rather an entire complex application, or a collection of applications and other resources that would look a lot like a datacenter. The hybrid cloud is all of this together, appearing to act as one. The challenge is that very few of these components were designed together, nor should they be. As a result, each component has its own way of being used and accessed (i.e., AWS compute is different from using an HP or Dell server in every way, but both run VMs). Software systems that enable hybrid cloud deliver organizational agility through a prescriptive model for incorporating arbitrary private and public components that end up looking and acting like a cohesive whole. Governance and compliance are a centerpiece here. A complete hybrid cloud framework won’t let users do the wrong thing (like replicate a private database off premises), but it won’t get in the way of users doing their jobs (like enabling prescribed use of elastic computing quickly). The file servers of old did the same things, like enforcing disk quotas or setting read/write permissions on file shares. The user didn’t need to remember what was allowed.
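
The governance idea reduces to a policy check sitting in front of every placement decision: prohibited placements (like the private-database example) are refused, prescribed ones go through without friction. A minimal sketch, assuming a toy policy table; the classifications, locations, and rules are invented for illustration and are far simpler than what a real hybrid cloud framework expresses.

```python
# Toy placement-policy check for a hybrid cloud framework.
# Classifications, locations, and rules are all hypothetical.

POLICY = {
    # data classification -> locations where that data may be placed
    "private": {"on_premises"},
    "public": {"on_premises", "public_cloud"},
}

def may_place(classification: str, location: str) -> bool:
    """True if policy allows placing this class of data at this location."""
    return location in POLICY.get(classification, set())

# Blocked, like replicating a private database off premises:
assert not may_place("private", "public_cloud")
# Allowed, like prescribed use of elastic public compute:
assert may_place("public", "public_cloud")
```

The point of putting the check in software is exactly the one made above: the user never has to remember what is allowed, any more than a file-share user had to remember quotas or permissions.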

Of course, the framework needs to adapt over time to incorporate things it doesn’t know about today, but a proper architecture will support that without user and application disruption. As businesses of all sizes adopt the many kinds of cloud computing resources, the only way to realize their full potential is by deploying a hybrid cloud system early in the cycle, so that a culture of inclusiveness, as opposed to silos, takes root early rather than late. By doing this, the ability to use “all of it” becomes a reality as opposed to something only presented at a conference. When I met the team at ServiceMesh a few years ago, I saw the new hybrid IT for the first time. Congratulations to them and our new friends at CSC. Best speed on the journey ahead together.

Content Silos and the Distribution of Personal Business Intelligence

Written on January 23, 2013 in Blog

The notion of personal business intelligence may seem a bit odd. The fact of the matter is that we all create, consume, view, and bookmark a large body of content every minute of our online and offline lives. Those of us who work for businesses of all sizes (and that means anyone who gets paid or attempts to) face a growing number of content places and types all the time. The network we have to maintain in our brains to remind us of the “what and where” is challenging.

In the “client-server” days we had our inboxes, file shares, internet, and intranet sites. It was fairly easy to create a map of where things were, and people only had a few places where they could really create content. Line-of-business applications accepted content from people or systems, but in very structured ways, and, in general, provided fast access to the content since the applications were specific, single-purpose, and generally written by the in-house IT staff. Everything else was stored on file servers or in e-mail, and the external internet was merely a collection of websites one could bookmark or throw a search engine against if you forgot something.

A few trends together have changed everything:

* The rise of SaaS as a buy-versus-build alternative to bespoke line-of-business applications (Workday, Concur versus in-house apps)

* The linkage of personal-use consumer systems like LinkedIn (social networking) or Dropbox (sharing) to our business workflows

* Mobile devices and networks as a preferred consumption and convenient publication point

* Adoption of rich collaboration systems (SharePoint and Box.net)

The silos are not only in different data centers but also from different vendors, and the applications all have different search, retrieve, and share mechanisms. The need to get at everything from a mobile device can be challenging, not only due to form factor but also because authentication and security when you come in from the hostile public network are different from what happens on your laptop. Yes, I acknowledge the existence of the mobile VPN, but I can’t prefer it. And of course, let’s not forget that all of these content stores encourage us to create lots and lots of data as we share ideas, opinions, and intelligence in ways that simple file creation, e-mail forwarding, and line-of-business application data collection didn’t. And this is all a good thing.

It is good because organizations comprised of people who document and share information openly are better organizations. Some will say that too much data can cause the proverbial information overload. But on balance, most of the time, more data is better than hidden or missing data. Also, and often overlooked, is the fact that the best of the new silos make their data available via programming interfaces. In-house line-of-business applications never really did that meaningfully, so the data was captive until the central IT folks got around to “mining” the database. The strong, behind-the-scenes structure prevented the free-flowing, on-the-fly sharing that exemplifies the most agile organizations. New applications will go after that data and connect it.

As we continue our march onto the new business software platform (cloud + SaaS + mobile + data-enabled everything), the opportunities for efficiency at the individual, group, and company level will leapfrog the state of the art of just a few years ago. I hear too many people talk about how we will drown in data. There is no need to drown. Solutions will embrace the deluge and harness its power. Core technologies like MapReduce and NoSQL are just the beginning, just like the relational database servers that emerged in the 1990s to power the client-server movement. Elastic storage and compute are the fuel. Fast networking and open data interfaces are the pathways. We’re on our way, and I’m looking forward to a new wave of innovation over the top of all this. A couple of Ignition companies are working in this space now. Both SnapLogic and the stealthy TipBit are leading great innovation and have embraced the trend, and we are proud to be investors in them.

Dispatch from Under the Radar 2012 – Consumerization of IT

Written on April 29, 2012 in Blog

The show had a good lineup of companies once again, and one of my investments, AppFog, won the People’s Choice, which is the “hot company” online vote by people attending or not. The keynote from Yobie Benjamin, Global CTO of Citigroup, had some really good information, and it’s worth watching on the streaming site, but if I were a startup trying to sell to Citi, I would be discouraged. His prerequisite for any conversation is whether or not you will be “around,” and he defined that as having a big slug of money on the balance sheet. It’s a fair statement from the CTO of Citigroup, but it flies in the face of the show theme. The Consumerization of IT is driven by IT professionals in the workplace doing and delivering the kinds of things they do as consumers. Full stop. I’m not sure anyone called the CFO at Dropbox and asked about the balance sheet before throwing a few photos in there. I’m not saying that Yobie is wrong for his business, but it’s not consistent with the theme of the show and the movement around consumerization. That said, I really liked his talk, and I’m super happy he made the time to be there.

The presentation format is pretty tight. Each company gets six minutes plus Q&A time from the judges. I thought Peter Yared from CBS Interactive was the best judge in terms of the thoughtfulness of his questions. All of the judges did well, and it’s great of them to take time from their busy schedules to do these events. I do judging and panels as well, and to do it right takes effort. You can’t just show up.

One thing that I believe was missing was requiring each presentation to open by telling the audience why the company fits into the consumerization theme. Minimally, the moderator for each section of the conference could have explained why the category of companies in the section fits into the theme. I don’t think that the mere fact that an IT pro can use a product by swiping a credit card is enough.

Strikingly absent from the program were any companies doing hybrid cloud with real management beyond the core infrastructure companies (the infrastructure section), which were well represented by Cloudscaling, Piston, and Zadara Storage. These types of companies need to tell the audience that they enable the central IT folks to deliver IaaS to their business units that “feels” like AWS and/or delivers AWS-like benefits while still offering the flexibility of on premises or off, plus internal controls. None of them really made that pointed statement, but I do think good listeners might have gotten the message. Then again, they only get six minutes, and they did a good job of representing themselves in that short time.

The section on PaaS was well done and informative. The companies in the section were pretty mature and included AppHarbor, CloudBees, Cabana, and AppFog. In the past, if I had to describe why PaaS fits into the consumerization theme, I’d have had to stretch a bit. But this show got me thinking about it. I think Lucas Carlson from AppFog had the seed of it. He talked about the elimination of “ops” for the developer. That is, the developer focuses on code, not on operating servers and keeping things up to date. So in a sense, PaaS in general enables a developer to have an experience akin to the simple store-and-retrieve of files or photos (like Dropbox). In the PaaS context, I drop my code in the PaaS and away it goes. The Dropbox user doesn’t worry about load balancing the server her files are stored on, or backing them up, or doing patches. The developer using a PaaS gets that experience plus a whole lot more. You don’t need to work at a big company with a devops staff that builds and maintains the internet-facing platform; the PaaS provider does it. So software development orgs of all sizes can get the same benefit. This is clearly part of the theme. None of the presenting companies made the explicit point, and they should! And yes, they all pay as they go, so it’s consumer-like, of course!
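
The “drop my code in and away it goes” division of labor can be caricatured in a few lines: the developer supplies only the application function, and the platform wraps it with the ops concerns that would otherwise be someone’s devops job. This is a toy sketch, not any real PaaS API; the `paas_deploy` name, the retry count, and the use of logging as a stand-in for patching, backup, and load balancing are all invented for illustration.

```python
import functools
import logging

logging.basicConfig(level=logging.INFO)

def paas_deploy(app):
    """Pretend platform: wraps a bare app with ops concerns the
    developer never writes. Logging and retries here stand in for
    load balancing, patching, and backups."""
    @functools.wraps(app)
    def managed(*args, **kwargs):
        for attempt in range(3):  # platform-managed retries
            try:
                logging.info("request handled by the platform")
                return app(*args, **kwargs)
            except Exception:
                if attempt == 2:
                    raise  # give up after the last attempt
    return managed

# The developer's entire job: write the app and "drop it in".
@paas_deploy
def hello(name):
    return f"hello, {name}"

print(hello("world"))
```

The developer writes only `hello`; everything in `paas_deploy` is the part Carlson’s point says disappears from the developer’s plate.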

It is still early days for the Consumerization of IT to be woven into company messaging. The smarter companies will not just market to it but believe in it and build for it. They will be rewarded for their efforts, as businesses of all sizes will be able to adopt their technology in ways similar to how IT pros and end users do in their daily lives. IT pros will bring in tech from small companies that innovate fast, enable low cost of adoption, and deliver a level of simplicity in line with the stuff they use at home. The central IT folks will adopt the more comprehensive solutions that enable their own organizations to deliver a cloud-like experience to their business users. That’s what this trend is about. It’s way more than pay as you go.

When you are 10x behind in mobile apps, your tools probably ought to be 10x better

Written on April 12, 2012

As part of my Windows Phone trial, I am going to dig into the developer tools. I’ve written a little throwaway iOS app, and I’ve written one with Parse (super easy!). So I’d like to understand the experience of writing a Windows Phone app.

App Hub seems to be the starting place. Like a lot of marketing-driven websites, there are a lot of words up here, and indices of more words, and pointers to more words. Not a lot of help for me to actually do something — Parse is a nice contrast, with sample Parse code on the landing page and a signup button right on the first page which leads to a very simple signup. You can get developing with Parse in literally a minute; not so with App Hub.

Anyway, I followed the pointers and installed the Windows Phone SDK. There are some words up there about getting a Visual Studio Express edition, and I was thinking, thank goodness, because VS is kind of a beast. Well, I was wrong; I seem to have gotten a pretty significant chunk of VS, with templates for all kinds of code projects. It actually took me a while to figure out where the templates for Windows Phone projects were, and I actually found several and couldn’t figure out which was the right one to start with. (I did have a version of VS installed a year ago and uninstalled it, but perhaps it left some residue behind which made my VS Express look more complicated.)

So I figured I should sign up with App Hub and get a developer account, assuming there would be some guidance on what to do next. Well, apparently that is a hard thing to do. My credit card transaction keeps getting turned down with no explanation. Munging through forums and trading email with App Hub support has revealed that this is a common issue; there is something very off with the Microsoft billing system. People wait for days to get their accounts approved. I’ve been told I need to use IE9 to sign up, that I have to visit five different subdomains and make sure my account information is 100% consistent across all of them, and that I may just want to give up and try again with a new account. I’ve tried everything to no avail. Oh, and the billing site is incredibly slow.

So I struggle on. I have email in to several people for help. But some broad prescriptive advice for MSFT at this point: when you are 10x behind in mobile apps and mobile app developers, you should probably aspire to have tools and a developer program that are 10x easier to use. Some specific ideas:

* Fix billing. I’d argue for getting rid of it altogether: let any damn fool into the developer program; MSFT needs developers. The billing system has clearly been poor for years, and it needs some energy applied to it.
* Radically simplify VS. If what I am seeing is what all developers see, it is too much. Too many templates, frameworks, language choices, etc.
* Make the developer website more about doing, less about telling. Developers should be developing code in seconds and minutes, not hours. They can go munge through detailed technical material later; get them up and running in a dev environment with sample code fast.
* Melding the above two ideas, look at something like Cloud9. Host a dev environment right on the site, require no download or install, and let people start coding in seconds. Add cloud storage of code so they can pick up their coding anywhere, and a cloud-based testing environment (I’m sure some of our portfolio companies like Skytap would be happy to help). Make it dramatically easier to get a dev and test environment set up.
* Talk with the Parse guys; they have figured out how to make it super easy to develop mobile apps, solving a lot of the backend issues that many developers don’t need to deal with.

This is just the beginning. I am sure MSFT has plenty of smart folks who have ideas. It is not a time to hold back; I’d look hard at bold steps to really change the playing field.

UPDATE: Some nice folks at MSFT helped me get this solved, but in a nonscalable way. I appreciate the help, but it doesn’t solve the problem for the mass market.

On the Consumerization of IT

Written on April 9, 2012 in Blog

We know we’re in the midst of a full hype cycle when we see conferences and blogs dedicated to a technology topic. This month, I am attending my first startup show with the Consumerization of IT as the theme — Under the Radar, on April 26, in Mountain View, Calif. There is even an acronym now, CITE or something like that, and of course some hashtags. But let’s step back. What does the Consumerization of IT really mean?

A few years ago, I heard my friend (and boss at the time) Mark Templeton, CEO of Citrix Systems, introduce the term. It was in the context of a Citrix app store that enables business users to search for and select applications the same way one would on iTunes (and in the context of BYOD). I thought, “Well, OK, this is pretty cool,” and there was a nugget of truth and value in it. It would be nice to just “find” my business applications at a well-known and trusted place. Mark is a visionary kind of guy, on the leading edge of thinking and messaging, and this instance was no exception. He was ahead of the curve once again; at that point in time, the popular press treated consumerization/BYOD as a snooze fest.

In the past few years, I have observed a lot of folks trying to define and characterize this thing called consumerization of IT. It is multi-dimensional in nature, and there are aspects for both the end user and the IT Professional.

End users are more and more becoming, or already are, digital natives. They are totally comfortable with technology and expect things to just work. The notion of compromise really doesn’t exist, and products need to unfold themselves naturally before users’ eyes. They are accustomed to “just using something.” None of them read a comparison whitepaper before deciding to use Google or Bing, Zagat or Yelp, Hipmunk or Kayak. It is an instant decision based on results and feel. Likewise, none of these users went to a training class to learn how to buy a book on Amazon. It was intuitive; they did it once, a book appeared, and no credit card fraud occurred. Then repeat, and boom: success for Amazon, and a new way of purchasing books and other goods was born. I’m definitely not #justsayin. This really is a big deal.

For the IT professional, there is a desire to bring in products and services that mimic their own experience as consumers. It’s easier and less risky, and it definitely costs less to get going. What IT professional wants to call the budget folks to ask for funds to buy a few servers, just to try out a business application such as a workforce management or expense management product? Then there is the time it takes to install software, figure out data protection, keep the application up to date, etc. Yes, custom in-house applications still need to be in-house (a topic for another post on hybrid cloud), but packaged software? The IT pro just wants to do a little research, cruise over to a SaaS website, and start trying the application. If they like the application, they pop in their credit card number and boom, they are a customer. If they don’t like the application, they let the trial expire. No capital is spent, and no time is wasted on infrastructure aspects that don’t matter. A subtle note here is that the IT pro probably never talked to the product’s publisher or worried about the size of the company that made the application. The user based his or her decision on utility, quality, word-of-mouth credibility, and the voice (like on Twitter) of the producer. It’s no longer about the size of the sales force or the D&B rating. It’s all self-service, and there is the growing phenomenon that users are adopting technology from much smaller outfits than ever before.

The consumerization trend is here to stay. It doesn’t mean that all applications move offsite. It does mean that products on premises and off premises, application or infrastructure, need to adapt and enable agility and frictionless deployment. And one last point: applications had better be accessible from a smartphone. Think differently, said my friend Mark. It’s nice to see consumerization and BYOD happening widely now.

Check out Motif Investing and help me win an iPad

Written on April 3, 2012

I love Motif, and if you check it out, I could win an iPad. OK, I will probably have to give the iPad back since we are investors, but still, I do love Motif. It is worth looking at if you do any investing whatsoever: a much more natural way to invest.

Skeptical about voice control

Written on April 2, 2012

There was a Sunday NYT article on voice recognition and how we are all going to control our TVs and other devices with voice. Building on the Siri wave, there is a popular belief that voice will become a significant or even dominant way we interact with devices and services.

I’m a big believer in voice. Ignition is an investor in Spoken, which is doing great with its existing cloud voice processing business and has some great ideas for the future. We’re an investor in AVST, Twisted Pair, and Public Mobile — all voice-based businesses, all doing great. People are never going to get tired of talking to one another.

And that is what voice is really all about — people talking to people, not to devices. I will invest all day long in technologies that improve people talking to people — making it easier, more accessible, cheaper, augmenting with additional services, hosting conversations, etc.

On the other hand, we don’t talk to our tools and instruments. We touch them. A well-designed tool or instrument fits the hand naturally, and in the hand of a skilled practitioner allows great creativity and/or great performances. The feedback during its use is important; we are very sensitive to it and can adjust our use in very fine increments. We don’t attempt to use voice, which is an imprecise, error-prone method — in fact, trying to talk very precisely can be quite annoying and unnatural.

So are our computational devices more like tools, or more like people? Do we want to interact with them as tools, or as people? My gut says more like tools, and that we will be more effective using touch and gestures than voice.

There are always going to be edge cases in which voice control is preferred — people with disabilities, hands-free situations. But I’m not convinced voice control will become significant.

Great visit to Tier3 this week

Written on March 3, 2012

I had the chance to meet Jared Wray at Tier3 (one of our portfolio investments) on Friday, and I was incredibly energized by the meeting. Jared is a star, and Tier3 has a huge future.

I’m not generally an enterprise IT guy. I’ve worn an IT hat at times, but always for small businesses or small offices. I’ve done some enterprise app development, but eons ago. I’ve worked on software teams that have sold into enterprises and have spent time working on features to support enterprises, so I have some sense of their issues, but I am no expert. So take my views with a grain of salt.

With that caveat — wow, have these guys done a terrific job creating a relevant cloud offering for enterprises. It seems super easy to roll apps out to their service because Tier3 supports a huge range of enterprise software with preconfigured orchestration blueprints for setting it all up; they support enterprise security requirements, they understand and provide great monitoring, and they provide enterprise SLAs, all while delivering the great cloud attributes like elasticity. And with their new service provider partners, there are going to be a ton of hosting options in locations that work for enterprises, to serve the need to “hug your servers.”

It seems like a no-brainer for people to try and adopt Tier3:

* If you are in enterprise IT and want to move some of your apps to the cloud, this seems like the way to go, or at least worth considering. And with no-cost self-service activation, there is really no reason not to try.
* If you are a startup targeting the enterprise, Tier3 gives you access to the computing environment of the enterprise. Again, it is free to sign up with a pay-as-you-go model, so why not try?
* If you are a service provider and want to offer enterprise-grade services to your enterprise customers, there is a great set of services available for adoption.

We (Ignition) really have to step up and help Tier3 get the word out about what they are doing. They are already growing at a great clip but we can and should help them do more. They need great people in sales, marketing, and product development. And they need trials from customers and feedback.

Very exciting, great to be working with these guys.

Made my first contribution to a Kickstarter project, the Zooka

Written by on February 23, 2012 in individuals - Comments Off

Seems like a nice speaker, the Zooka (though they probably have a trademark issue to resolve), and I’m glad to support a Northwest project. It is also exciting to see the diversity of projects up on Kickstarter, and nice to see that people are willing to pay for value and creativity. After 15 years of people demanding more and more free content and services on the Internet, any shift back toward sustainable business models seems good. Personally, I feel way better about paying for something than getting “free” content and having my attention sold to the highest bidder without my involvement or consent.

MSFT and the decline of the PC hardware ecosystem

Written by on February 18, 2012 in individuals - Comments Off

In the late 80s, IBM attempted to reassert control over the PC hardware platform with the introduction of the PS/2 and its proprietary Micro Channel architecture. The cloners fought back, customers voted with their feet, the PS/2 initiative failed, and the era of open PC hardware continued and flourished. This was hugely beneficial for MSFT as a thousand PC OEMs bloomed, PC-based innovation surged, costs dropped, and MSFT software rode the wave of market expansion.

And it was great for end users, not only because it drove system costs down, but because it created a rich market of add-on products: everyone could mix and match hardware to create their optimal system, whether they cared about cost, performance, maintainability, upgradability, or whatever. Corporations could spec out and build standard low-cost machines, enthusiasts could build super-tweaked machines, and verticals could build out specialty machines, all on the same open hardware platform.

In the last 15 years, though, the market has shifted dramatically toward the laptop form factor. This shift has been a relative disaster for MSFT. The industry has moved away from an open hardware chassis with mix-and-match components to closed, tightly engineered all-in-one machines. This shift has played to Apple’s strengths in design and integration and has negated many of the benefits of the PC ecosystem. The PC industry is still struggling to figure out how to regain design and profit momentum, with Intel’s Ultrabook effort being the latest scheme. But the Ultrabook is just a direct response to the MacBook; it does nothing to recapture the open hardware experience of the 90s.

The open hardware community still exists in various forms, but it is no longer focused on the PC platform. Enthusiasts still build PCs, mostly for gaming; Maximum PC, for instance, has a good guide to components, and Newegg is the place to buy. But this isn’t mainstream any more. The “maker” community is vibrant but largely focused on other platforms: Arduino, the Kickstarter community, and so on. The vibe and energy around open hardware is great, but it is no longer tied to the PC experience and is no longer an asset for MSFT.

MSFT has always been great at chasing taillights and is hard at work supporting the Ultrabook, competing with the Apple stores at retail, pushing Windows Phone, etc. But chasing Apple’s taillights results in products that are more and more like Apple’s: fully integrated hardware/software/services and a captive retail experience. MSFT has to do all this because the mainstream of the market is here, but there is nothing distinctive about the resultant products and experience. The Ultrabook/Windows/Microsoft Store products may equal the Apple experience, and may offer users a few more choices of hardware brands (does anyone care?), but the experience won’t stand out. This is necessary work, but not sufficient to recapture thought leadership in the market; at the end of the day, MSFT will be able to claim parity but no more than that.

If I were in a leadership role at MSFT, I’d invest in strategies to recreate the open hardware platform dynamic around the Windows platform. It is not obvious how to do so with the laptop and tablet as the mainstream platforms, but I would spend hundreds of millions trying. MSFT clearly has the cash to spend on new frontiers and new adventures, and a couple hundred million on an effort to change the basis of competition in the PC market seems like a wise bet, even if it fails.

How about putting a “maker’s corner” in every retail store with modified cases and modified machines, maybe even workshops? Get the energy of the PC gaming community into the store and let people see it. How can the laptop design be modified to support add-on hardware: super-high-speed optical expansion buses, wireless high-speed expansion buses, novel expansion chassis ideas? Sifteo cubes are kind of cool; can that idea be used to provide hardware extensions to laptops? Are there other ways to “snap on” hardware to extend the laptop or tablet, using Bluetooth or induction or other mechanisms? Can MSFT seed the maker community with funds or tools? Can MSFT embrace Arduino somehow, or Kickstarter? Could the PC be the hub for thousands of Arduino-based sensors, actuators, and gadgets? These ideas are all admittedly poorly thought out, and I am not sure any one of them is right, or whether any will work.

But I would spend a lot of money chasing any idea that moves away from closed all-in-one hardware designs, and I would experiment with many ways to reinject open hardware dynamics back into the PC/tablet market. The Ultrabook is not this; it is a fine and adequate taillight chaser, but it won’t shift the competitive balance back in MSFT’s favor.

This is not the only reason for MSFT’s stagnation in the last decade, and there are many other aspects to consider, but the dwindling of the open hardware ecosystem has been a loss for MSFT. For another take on Apple’s success against MSFT over the last decade, check out Rich’s analysis; the observations about vertical vs. horizontal integration ring true.

Enterprise 4.0: IT’s ongoing re-platforming

Written by on February 11, 2012 in Blog, individuals - Comments Off

As we look back at the history of computing, it’s clear that each wave ushered in new rounds of groundbreaking technology that birthed new companies, increased productivity, gave rise to IT, empowered businesses and changed the world. In this blog post, I’ll take a look at early technology waves, reflect on how they changed IT and look at the most recent trends and the opportunities they usher in for innovation.

We are experiencing a number of shifts and a true evolution in enterprise IT. It seems like every day a new term is coined and new trends are “up and coming.” Over the past several decades, a few major waves come to mind and can be identified as ongoing trends:

1. Mainframe / mini era (1959)

2. Networked desktops and client server (1986ish)

3. Browser based and app server (1997)

4. Mobile and “cloud” (2008)

Each of these major trends caused transitions that led to a new way of interacting with technology. I like to call this shift “re-platforming.” It’s bigger than a mere transformation since it affects the way things are “stood up” in an enterprise. This re-platforming is a catalyst for IT, which constantly needs to reposition and rebrand itself to meet current times and the needs of its users.

Mainframe/mini era

I don’t have first-hand experience working with mainframes and minicomputers, but I did see them fade into the background with the proliferation of microprocessor-based systems in the 1980s. What makes this interesting is that, with the arrival of the microprocessor-based PC, those in the mainframe/mini industry belittled it for not being a “real” computer. Yet use of the PC grew, and soon it took over jobs previously done only on its bigger cousins, such as data entry and text editing. These use cases opened the door for disruption and innovation, which arrived by way of Lotus 1-2-3 and Microsoft Word. The PC no longer needed a reason to be, and it ushered in a completely new definition of a computer.

Networked desktops and client servers

As technology advanced, the PC became more powerful, and technologists began looking for ways to improve performance by connecting computers together, leading to the development of Ethernet in 1980. Ethernet allowed PCs to be connected, and soon the notion of networking the PC (client) to a host (server) was born. The idea encouraged openness, commoditized hardware and software, and gave rise to the idea that your client could be anywhere; it no longer had to be in the same building or even the same state.

Browser based and application servers

In the mid-1990s, the Internet began to take hold, and IT experimented with the idea of housing an application on a central server and using the Internet as the access point to it. Since the networking and server disciplines needed to exist as a prerequisite, the client-server build-out laid the groundwork for application servers: if your hardware (client) could be anywhere, why couldn’t the application be anywhere? This brings us to the browser and application server era. The growth of programmable web servers and browsers shepherded in application platforms like BEA and .NET, and the broad deployment of software such as enterprise resource planning (ERP) and customer relationship management (CRM) empowered businesses to garner more value from their data.

Cloud and Mobile

More recently, IT has begun to ride the cloud and mobile wave. Every day I see new companies and innovative technologies emerging to leverage this trend. While we’re still at an early point of cloud adoption, it is clear that cloud is, and will continue to be, a huge game changer for IT.

Cloud computing encompasses many things, and I want to look at both public and private (or hybrid) clouds and the emerging technologies around them, chiefly the “as a service” platforms:

· Software as a Service (SaaS) – the new way software is delivered

· Infrastructure as a Service (IaaS) – think of it as the new server, network and storage

· Platform as a Service (PaaS) – the new developer tool stack

These emerging technologies are having a huge impact on IT and are put to work differently based on each enterprise’s specific requirements. On-premise, or private, cloud computing (IaaS and PaaS) is important for enterprises looking to maintain privacy while still getting the same self-service semantics as the public cloud. Public, or hosted, cloud computing gives central IT managers the flexibility to broker various services to their business users, saving dramatically on cost and time.

In mobile, as always, history repeats itself. Just as the mainframe computing world belittled the PC, so the PC world belittled the mobile handheld device. Think back to 2007, when the iPhone was first introduced. The cool mobile phone was the small, sleek Motorola Razr, and its champions poked fun at the iPhone and called it a brick. And yet, what happened at the end of this year’s Super Bowl? The TV cameras charged onto the field as the Lombardi Trophy was about to be awarded. What did viewers see? A sea of iPhones in the hands of NY Giants players as they rushed to capture the winning moment. Yes, the iPhone and the other smartphones it ushered in have won. They are first-class computing citizens with capabilities that laptops lack, including location-based services and truly continuous connectivity. Mobile devices as enterprise IT endpoints are no longer the exception but the rule.

Users are enamored with their smartphones: always-on connectivity, easy access, and visually exciting applications that find their favorite restaurant or keep them connected to family and friends. These very same users want those attributes in their business applications too. They want to point (or touch) and shoot, and within seconds have their application up and running. They don’t want to have to enter a URL in a browser; that’s the last resort now. On the surface one may say “so what”: the phone is a microcomputer with a little OS and some APIs, so just hire a developer to create little applications for that small screen. If only it were that simple. The phone is completely outside the corporate network, and its apps need to deal with network latency that would give an inside-the-firewall app timeouts left and right. The phone doesn’t readily give an end user the opportunity to authenticate with Active Directory, and it needs its own functionality and development framework, which, it turns out, gave birth to mobile PaaS. The establishment never sees the disruption in its true glory.
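To make the latency point concrete, here is a minimal sketch of the retry-with-backoff logic a mobile app needs on a flaky cellular link, where an inside-the-firewall app could simply assume a fast, reliable network. This is illustrative Python with a simulated request, not any particular mobile framework's API:

```python
import time

def call_with_retry(request_fn, max_attempts=4, base_delay=0.5):
    """Retry a flaky network call with exponential backoff.

    request_fn is a zero-argument callable that raises on failure
    (e.g. a timeout) and returns a response on success.
    """
    for attempt in range(max_attempts):
        try:
            return request_fn()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # out of retries: surface the error to the caller
            # wait base_delay, then 2x, 4x, ... before the next attempt
            time.sleep(base_delay * (2 ** attempt))

# Simulate a mobile client: the first two requests time out, the third succeeds.
attempts = {"count": 0}

def flaky_request():
    attempts["count"] += 1
    if attempts["count"] < 3:
        raise TimeoutError("request timed out")
    return {"status": 200}

response = call_with_retry(flaky_request, base_delay=0.01)
print(response["status"], "after", attempts["count"], "attempts")  # 200 after 3 attempts
```

A desktop app on the LAN rarely bothers with this; for a phone app it is table stakes, which is part of why mobile backends grew their own frameworks.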

What about getting apps onto the device and managing them? Is this an IT function? What traditional PC and app lifecycle management tools are built for this? Ah, the web app. IT moved away from heavyweight apps a while ago, and now we’re back to doing that again for mobile. The net of it is that all these challenges add up to opportunity, and desire is high at the point of attack.

The next big shift

Each new trend brought disruption, and disruption brought opportunity: opportunity for the next brilliant mind to create a technology that solved the obstacle. Each trend was accompanied by the creation of new companies and technologies. What disruption will arise from the mobile and cloud trend?

I’m betting on technology born in, and enabling the success of, big consumer-facing properties like Facebook, Zynga and Google as the next shaper of enterprise IT. Examples include NoSQL, Hadoop and social mechanics. Unstructured data is growing faster than structured data and is being mined for business intelligence and productivity. Organizations are attempting to leverage social networks, fascinated by the way people interact with each other and by their ability to rally a protest or crowdsource the facts of a news story. It’s only a matter of time before our yearbook photos and status updates are part of the company directory.
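For readers who haven’t touched Hadoop, a toy sketch of the map/reduce pattern it popularized may help. This is plain Python over in-memory strings, not the Hadoop API, but the shape is the same: mappers emit key/value pairs from unstructured text, and reducers aggregate them by key:

```python
from collections import defaultdict

def map_phase(documents):
    # Mapper: emit a (word, 1) pair for every word in the unstructured text.
    for doc in documents:
        for word in doc.lower().split():
            yield word, 1

def reduce_phase(pairs):
    # Reducer: group the pairs by key and sum the counts.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

docs = [
    "status update from the field",
    "the field team posted a status",
]
word_counts = reduce_phase(map_phase(docs))
print(word_counts["status"], word_counts["field"])  # 2 2
```

The point of the real system is that the map and reduce steps run in parallel across thousands of machines over data too big for any one of them, which is what makes mining unstructured data at consumer-web scale practical.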

We’re at the start of a wave and new companies are being created on a daily basis to lead the way to innovation. Each previous wave resulted in the creation of great enterprise software companies and we are sure to see this continue. Disruption creates opportunity and the desire to grab the opportunity yields outstanding innovation. At Ignition Partners, we are investing in this latest wave now and will continue to invest in technologies that are addressing the needs of future waves. As operators during the previous waves, and now as investors, we’re excited to be part of it.

Ignition news roundup — Symplified, Whiptail

Written by on January 19, 2012 in individuals - Comments Off

First off, we are surviving the 2012 Snowpocalypse. Office traffic is light, but folks are here.

On the business front, it was announced that we led a round in Symplified. Great company building some pretty essential tools to manage employee identity and engagement across the web; I can’t imagine how companies manage their voice and presence without this.

We also joined the investor group behind Whiptail, which builds high-scale SSD arrays to replace spinning disks. Spinning disks: it seems like we will look back at these in 100 years and laugh, or at least class them as a steampunk kind of gadget.

Excited to work with both companies.

I’d also be remiss if I didn’t note the Bluestacks CES award and the Splunk filing today

Written by on January 13, 2012 in individuals - Comments Off

Also of note today: Bluestacks winning the CES best software award, and Splunk’s filing. Congrats to both teams on their progress.
