Archive for the Blog Category
Today we become a bigger part of the team at Coho Data. Since we have been part of the team from the beginning, it doesn't feel much different. However, the great work done over the past year has excited us to the point of leading the investment in the latest round of financing. That great work is the delivery of an amazing storage product with disruptive performance characteristics and a future enabled by riding the evolution of the old, leveraging the new, and catalyzed by the magic of software.
The combination of old, new, and software to make a great product has a pattern of success. The first PCs were just that. IBM combined a new microprocessor and bit-mapped graphics with some inexpensive 8-bit peripheral support hardware and a simple operating system, and an industry was born. Coho is combining technology for a similar effect. The old here is standard off-the-shelf servers, disk, and flash. We love that stuff since everyone knows what it is and can trust it; it's already part of the tried and true compute infrastructure we use all the time. The new is a programmable SDN switch. The software magic delivers a storage system with industry-leading performance from a single node through many, with linear scale. And that many is really many. What this means for customers is the ability to start small while knowing up front that they don't need to change architectures as their needs change. Scalability goes both ways (up and down). We often forget that. Of course we are all focused on more and faster.
Many times innovation seems like a cool trick, and the Coho architecture clearly is that. Great performance can usually be achieved by building a new system top to bottom. Coho didn't do that. It took a few components that were designed separately and then effectively created an operating system for scale-out file serving. The beauty of an operating system approach is that future performance benefits accrue to the customer as all of the silos of technology improve on their own. This is similar to the benefits end users get for their desktop applications as things like graphics and system I/O improve. A properly designed operating system doesn't require massive redesign or end user retraining as hardware innovation occurs under the covers. The user simply benefits. The user benefit is also about choice. A user can decide the level of performance needed and can pay for higher levels as needed, or not. The Coho architecture is centered on the same design philosophy, given a set of hardware components that have baseline capabilities. Of course, if you look at the founding team of the company, one would expect nothing less.
As proud investors in Coho Data, we look forward to continued innovation that will benefit businesses of all sizes along all performance and cost curves. Congratulations to the team on building a product that will change the face of storage. More to come.
Today CSC (Computer Sciences Corporation) agreed to acquire ServiceMesh. We at Ignition invested in the company more than two years ago before anyone was really saying much about hybrid anything other than vehicles. Our thesis was that hybrid would win and that ServiceMesh was defining what hybrid IT means. As THE proud and only venture investor we are excited to see the ServiceMesh vision (which is already deployed in multiple large organizations) now multiplied by the capabilities of CSC on a global basis. I thought it would be a good time to reflect.
Over the course of the past few years, a lively discussion of which is the right or winning cloud type has gone on and on. We have private, public, virtual private, hosted private, and then the variants of PaaS (public and private), and let's throw in SaaS just for good measure. All of this collectively can be described as Hybrid IT or, more popularly, Hybrid Cloud. I won't try to link to a definition for all of this because, well, the definition is just "all of it". So the winning model will be "all of it", but working together in a way that at least seems coordinated and coherent.
I use analogies from the earlier days of IT to draw parallels to where things are going. I've started using the first PC-style file servers as the initial examples of hybrid IT. In the 1980s the PC came into being and we all had islands of storage contained on our PCs. It was possible to store information "off machine" in the first clouds, which were merely telnet or FTP servers, or in places like AOL or CompuServe, or on a company-owned mainframe system (anyone remember IRMA?). Each of these off-machine places had a unique way of getting files back and forth to our PCs, and it was way different than the file access we had locally on our PC disks. So the end user had to know specific commands and had user IDs and passwords and all that good stuff. The application software (word processing and spreadsheet) of the day didn't bother trying to figure out how to store files in these clouds. It was the user's job to move files around and understand the protocols (modem or 3270) and unique commands to store and retrieve things in these places. So the PC was our private place and these off-machine places were our clouds. They were separate and apart. The local area network, the PC-style file server, and some clever software changed all that. A software trick enabled disk space on the file server to appear to the LAN-connected PC as if it were a locally attached disk. We called this technique redirection at the time, but it really was a form of virtualization and namespace management. The magic enabled application software and end users to use storage in a hybrid way without the user having to think (too much) about where things were. This hybrid storage between PC and server gave the IT folks the ability to provide real services for their end users while getting some control in the form of data governance and protection.
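For readers who like to see the idea in code, here is a minimal, hypothetical sketch of the redirection trick in Python (the real DOS-era redirectors were low-level system software; the server and share names below are made up for illustration): a resolver checks whether a path's drive letter is mapped to a file-server share and, if so, rewrites it to the remote location, so the application keeps using ordinary local-looking paths.

```python
# Hypothetical sketch of "redirection": paths on mapped drive letters are
# transparently rewritten to server locations, so applications keep using
# ordinary local-looking paths without knowing where the bytes really live.

# Drive letters the user "mapped" to file-server shares (illustrative names).
MAPPINGS = {
    "F:": "\\\\FILESERVER\\SHARED",
    "G:": "\\\\FILESERVER\\APPS",
}

def resolve(path: str) -> str:
    """Return the real location of a path: remote if the drive letter is
    mapped to a share, otherwise the local path unchanged."""
    drive = path[:2].upper()
    if drive in MAPPINGS:
        return MAPPINGS[drive] + path[2:]
    return path  # a genuinely local disk, e.g. C:

print(resolve("F:\\BUDGET.WKS"))     # -> \\FILESERVER\SHARED\BUDGET.WKS
print(resolve("C:\\AUTOEXEC.BAT"))   # -> C:\AUTOEXEC.BAT (untouched)
```

The point of the sketch is the namespace trick: the application asks for `F:\BUDGET.WKS` and never knows the file lives on a server, which is exactly the "think (too much) about where things are" burden being lifted.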
Fast forward twenty years and we can talk about all IT services in a similar way, but now the dividing line is a company's network and the outside world of the public cloud and SaaS. It's way more than the "where it is". It's clearly about how it's delivered. It's not enough for IT to simply allow the use of off-network resources. The combined on- and off-network resources need to be presented and accessed in the same way. The unit of access and definition is no longer something that looks like a disk drive but rather an entire complex application, or a collection of applications and other resources that would look a lot like a datacenter. The hybrid cloud is all of this together, appearing to act as one. The challenge is that very few of these components were designed together, nor should they be. As a result, each component has its own way of being used and accessed (i.e., AWS compute is different from using an HP or Dell server in every way, but both run VMs). Software systems that enable hybrid cloud deliver organizational agility through a prescriptive model for incorporating arbitrary private and public components that end up looking and acting like a cohesive whole. Governance and compliance are a centerpiece here. A complete hybrid cloud framework won't let users do the wrong thing (like replicate a private database off premises), but it won't get in the way of users doing their jobs (like enabling prescribed use of elastic computing quickly). The file servers of old did the same things, like enforcing disk quotas or setting read/write permissions on file shares. The user didn't need to remember what was allowed.
Of course the framework needs to adapt over time to incorporate things it doesn't know about today, but a proper architecture will support that without user and application disruption. As businesses of all sizes adopt the many kinds of cloud computing resources, the only way to realize full potential is by deploying a hybrid cloud system early in the cycle, so that a culture of inclusiveness, as opposed to silos, takes root early rather than late. By doing this, the ability to use "all of it" becomes a reality as opposed to something only presented at a conference. When I met the team at ServiceMesh a few years ago, I saw the new hybrid IT for the first time. Congratulations to them and our new friends at CSC. Best speed on the journey ahead together.
The notion of personal business intelligence may seem a bit odd. The fact of the matter is that we all create, consume, view, and bookmark a large body of content each minute of our online and offline lives. Those of us that work for businesses of all sizes (and that means anyone who gets paid or attempts to) face a growing number of content places and types all the time. The network we have to maintain in our brains to remind us of the “what and where” is challenging.
In the "client-server" days we had our inboxes, file shares, internet, and intranet sites. It was fairly easy to create a map of where things were, and people only had a few places they could really create content. Line-of-business applications accepted content from people or systems, but in very structured ways, and, in general, provided fast access to the content since the applications were specific, single purpose, and generally written by the in-house IT staff. Everything else was stored on file servers or e-mail, and the external internet was merely a collection of websites one could bookmark or throw a search engine against if you forgot something.
A few trends together have changed everything:
· The rise of SaaS as a buy versus build alternative to bespoke line of business applications (Salesforce.com, Workday, Concur versus in house apps)
· The linkage of personal-use consumer systems like LinkedIn (social network) or Dropbox (sharing) to our business workflows
· Mobile devices and networks as a preferred consumption and convenient publication point
· Adoption of rich collaboration systems (SharePoint and Box.net)
The silos are not only in different data centers but are from different vendors, and the applications all have different search, retrieve, and share mechanisms. The need to get at everything from a mobile device can be challenging not only due to form factor but also because authentication and security when you come in from the hostile public network are different from what happens on your laptop. Yes, I acknowledge the existence of the mobile VPN, but I can't say I prefer it. And of course let's not forget that all of these content stores encourage us to create lots and lots of data as we share ideas, opinions, and intelligence in ways that simple file creation, email forwarding, and line-of-business application data collection didn't. And this is all a good thing.
It is good because organizations comprised of people that document and share information openly are better organizations. Some will say that too much data can cause the proverbial information overload. But on balance, most of the time, more data is better than hidden or missing data. Often overlooked is the fact that the best of the new silos make their data available via programming interfaces. In-house line-of-business applications never really did that meaningfully, so the data was captive until the central IT folks got around to "mining" the database. The strong, behind-the-scenes structure prevented the free-flowing and on-the-fly sharing that exemplifies the most agile organizations. New applications will go after that data and connect it.
As we continue our march onto the new business software platform (cloud + SaaS + mobile + data-enabled everything), the opportunities for efficiency at the individual, group, and company level will leapfrog the state of the art of just a few years ago. I hear too many people talk about how we will drown in data. There is no need to drown. Solutions will embrace the deluge and harness its power. Core technologies like MapReduce and NoSQL are just the beginning, just like the relational database servers that emerged in the 1990s to power the client-server movement. Elastic storage and compute are the fuel. Fast networking and open data interfaces are the pathways. We're on our way, and I'm looking forward to a new wave of innovation over the top of all this. A couple of Ignition companies are working in this space now. Both SnapLogic and stealthy TipBit are leading great innovation and have embraced the trend, and we are proud to be investors in them.
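To make the MapReduce idea concrete, here is a minimal single-machine sketch of the pattern in Python (systems like Hadoop distribute these same phases across many nodes and add fault tolerance; the word-count job and sample documents are illustrative): a map step emits key/value pairs, a shuffle groups them by key, and a reduce step folds each group into a result.

```python
from collections import defaultdict

def map_phase(documents):
    """Map: emit a (word, 1) pair for every word in every document."""
    for doc in documents:
        for word in doc.lower().split():
            yield (word, 1)

def shuffle(pairs):
    """Shuffle: group all emitted values by key, as the framework
    would do between the map and reduce phases."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: fold each group of values into one result per key."""
    return {key: sum(values) for key, values in groups.items()}

docs = ["more data is better", "data is the new fuel"]
counts = reduce_phase(shuffle(map_phase(docs)))
print(counts["data"])  # "data" appears once in each document -> 2
```

Because the map and reduce steps operate on independent keys, a framework can run them in parallel across a cluster, which is the elastic processing property the post describes.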
The show had a good lineup of companies once again, and one of my investments, AppFog, won the People's Choice, which is the "hot company" online vote by people attending or not. The keynote from Yobie Benjamin, Global CTO of Citigroup, had some really good information, and it's worth watching on the streaming site, but if I were a startup trying to sell to Citi, I would be discouraged. His prerequisite for any conversation is whether or not you will be "around", and he defined that by having a big slug of money on the balance sheet. It's a fair statement from the CTO of Citigroup, but it flies in the face of the show theme. The Consumerization of IT is driven by IT professionals in the workplace doing and delivering the kinds of things they do as consumers. Full stop. I'm not sure anyone called the CFO at Dropbox and asked about the balance sheet before throwing a few photos in there. I'm not saying that Yobie is wrong for his business, but it's not consistent with the theme of the show and the movement around Consumerization. That said, I really liked his talk and I'm super happy he made the time to be there.
The presentation format is pretty tight. Each company gets six minutes plus Q&A time from the judges. I thought Peter Yared from CBS Interactive was the best judge in terms of the thoughtfulness of his questions. All of the judges did well, and it's great of them to take time from their busy schedules to do these events. I do judging and panels as well, and to do it right takes effort. You can't just show up.
One thing I believe was missing was requiring each presentation to open by telling the audience why the company fits into the Consumerization theme. Minimally, the moderator for each section of the conference could have explained why the category of companies in the section fit the theme. I don't think it's enough to assume a product fits just because an IT pro can use it by swiping a credit card.
Strikingly absent from the program were any companies doing hybrid cloud with real management beyond the core infrastructure companies (the infrastructure section), which were well represented by Cloud Scaling, Piston, and Zadara Storage. These types of companies need to tell the audience that they enable the central IT folks to deliver IaaS to their business units that "feels" like AWS and/or delivers AWS-like benefits while still offering the flexibility of on premises or off, plus internal controls. None of them really made that pointed statement, but I do think good listeners might have gotten the message. Then again, they only get six minutes, and they did a good job of representing themselves in that short time.
The section on PaaS was well done and informative. The companies in the section were pretty mature and included AppHarbor, CloudBees, Cabana, and AppFog. In the past, if I had to describe why PaaS fits into the Consumerization theme, I'd have had to stretch a bit. But this show got me thinking about it. I think Lucas Carlson from AppFog had the seed of it. He talked about the elimination of "ops" for the developer. That is, the developer focuses on code, not on operating servers and keeping things up to date. So in a sense, PaaS in general enables a developer to have an experience akin to the simple store and retrieve of files or photos (like Dropbox). In the PaaS context, I drop my code in the PaaS and away it goes. The Dropbox user doesn't worry about load balancing the server her files are stored on, or backing them up, or doing patches. The developer using a PaaS gets that experience plus a whole lot more. You don't need to work at a big company that has a devops staff that builds and maintains the internet-facing platform. The PaaS provider does it. So software development orgs of all sizes can get the same benefit. This is clearly part of the theme. None of the presenting companies made the explicit point, and they should! And yes, they all pay as they go, so it's consumer-like of course!
It is still early days for Consumerization of IT to be woven into company messaging. The smarter companies will not just market to it but believe in it and build for it. They will be rewarded for their efforts as businesses of all sizes will be able to adopt their technology in ways similar to the ways IT Pros and end users do in their daily lives. IT Pros will bring in tech from small companies that innovate fast, enable low cost of adoption, and deliver a level of simplicity in line with the stuff they use at home. The central IT folks will adopt the more comprehensive solutions that enable their own organizations to deliver a cloud-like experience to their business users. That’s what this trend is about. It’s way more than pay as you go.
View Frank’s full appearance on Fox’s Varney & Co. here.
We know we’re in the midst of a full hype cycle when we see conferences and blogs dedicated to a technology topic. This month, I am attending my first startup show with Consumerization of IT as the theme — Under the Radar, on April 26, in Mountain View, Calif. There is even an acronym now, CITE or something like that, and of course some hash tags. But, let’s step back. What does the Consumerization of IT really mean?
A few years ago, I heard my friend (and boss at the time) Mark Templeton, CEO at Citrix Systems, introduce the term. It was in the context of a Citrix app store that enables business users to search for and select applications the same way one would on iTunes (and in the context of BYOD). I thought, "well OK, this is pretty cool," and it was a nugget of truth and value. It would be nice to just "find" my business applications at a well-known and trusted place. Mark is a visionary kind of guy on the leading edge of thinking and messaging, and this instance was no exception. He was ahead of the curve once again; at that point in time, the popular press treated consumerization/BYOD as a snooze fest.
In the past few years, I have observed a lot of folks trying to define and characterize this thing called consumerization of IT. It is multi-dimensional in nature, and there are aspects for both the end user and the IT Professional.
End users are more and more becoming, or already are, digital natives. They are totally comfortable with technology and expect things to just work. The notion of compromise really doesn't exist, and products need to unfold themselves naturally before users' eyes. They are accustomed to "just using something." None of them read a comparison whitepaper before deciding to use Google or Bing, Zagat or Yelp, Hipmunk or Kayak. It is an instant decision based on results and feel. Likewise, none of these users went to a training class to learn how to buy a book on Amazon. It was intuitive; they did it once, a book appeared, and no credit card fraud occurred. Then repeat, and boom: success for Amazon, and a new way of purchasing books and other goods was born. I'm definitely not #justsayin. This really is a big deal.
For the IT professional, there is a desire to bring in products and services that mimic their own experience as consumers. It's easier and less risky, and definitely costs less to get going. What IT professional wants to call the budget folks to ask for funds to buy a few servers, just so they can try out a business application such as a workforce management system or expense management product? Then there is the time it takes to install software, figure out data protection, keep the application up to date, etc. Yes, custom in-house applications still need to be in-house (topic for another post on hybrid cloud), but packaged software? The IT pro just wants to do a little research, cruise over to a SaaS website, and start trying the application. If they like the application, they pop in their credit card number and boom, they are a customer. If they don't, they let the trial expire. No capital is spent and no time is wasted on infrastructure aspects that don't matter. A subtle note here is that the IT pro probably never talked to the product's publisher or worried about the size of the company that made the application. The user based his/her decision on utility, quality, word-of-mouth credibility, and the voice (like on Twitter) of the producer. It's no longer about the size of the sales force or the D&B rating. It's all self-service, and there is the growing phenomenon that users are adopting technology from much smaller outfits than ever before.
The consumerization trend is here to stay. It doesn't mean that all applications move offsite. It does mean that products on premises and off premises, application or infrastructure, need to adapt and enable agility and frictionless deployment. And, one last point: applications had better be accessible from a smartphone. Think differently, said my friend Mark. It's nice to see consumerization and BYOD happening widely now.
As we look back at the history of computing, it’s clear that each wave ushered in new rounds of groundbreaking technology that birthed new companies, increased productivity, gave rise to IT, empowered businesses and changed the world. In this blog post, I’ll take a look at early technology waves, reflect on how they changed IT and look at the most recent trends and the opportunities they usher in for innovation.
We are experiencing a number of shifts and a true evolution in enterprise IT. It seems like every day there is a new term being coined and new trends “up and coming.” During these past several decades, a few major waves come to mind and can be identified as ongoing trends:
1. Mainframe / mini era (1959)
2. Networked desktops and client server (1986ish)
3. Browser based and app server (1997)
4. Mobile and “cloud” (2008)
Each of these major trends caused transitions that led to a new way of interacting with technology. I like to call this shift "re-platforming." It's bigger than a mere transformation since it affects the way things are "stood up" in an enterprise. This re-platforming is a catalyst for IT, which constantly needs to reposition and rebrand itself to meet current times and the needs of its users.
I don't have first-hand experience working with mainframes and minicomputers, but I did see them fade to the background with the proliferation of microprocessor-based systems in the 1980s. What makes this interesting is that with the arrival of the microprocessor-based PC, those in the mainframe/mini industry belittled it for not being a "real" computer. Yet use of the PC grew, and soon it took over jobs previously done only on its bigger cousins, such as data entry and text editing – use cases that opened the door for disruption and innovation, which arrived by way of Lotus 1-2-3 and Microsoft Word. The PC no longer needed a reason to be, and it ushered in a completely new definition of a computer.
Networked desktops and client servers
As technology advanced, the PC became more powerful, and technologists began looking for ways to improve performance by connecting computers together, leading to the commercial arrival of Ethernet around 1980. Ethernet allowed PCs to be connected together, and soon the notion of networking the PC (client) to a host (server) was born. The idea encouraged openness, commoditized hardware and software, and gave rise to the idea that your client could be anywhere; it no longer had to be in the same building or even the same state.
Browser based and application servers
In the mid-1990s, the Internet began to take hold, and IT experimented with the idea of using a central server to house an application and using the Internet as the access point to it. Since the networking and server disciplines needed to exist as a prerequisite, the build-out of client-server laid the groundwork for application servers – if your hardware (client) could be anywhere, why couldn't the application be anywhere? This brings us to the browser and application server era. The growth of programmable web servers and browsers shepherded in application platforms like BEA and .NET, and the broad deployment of software such as enterprise resource planning (ERP) and customer relationship management (CRM) empowered businesses to garner more value from their data.
Cloud and Mobile
More recently, IT has begun to ride the cloud and mobile wave. Everyday I see new companies and innovative technologies that are emerging to leverage this trend. While we’re still at an early point of adoption with cloud, it is clear that it is and will continue to be a huge game changer for IT.
Cloud computing encompasses many things and I want to look at both public and private (hybrid) clouds and emerging technologies, which include the development of “as a service” platforms, mainly:
· Software as a Service (SaaS) – the new way software is delivered
· Infrastructure as a Service (IaaS) – think of it as the new server, network and storage
· Platform as a Service (PaaS) – the new developer tool stack
These emerging technologies are having a huge impact on IT and are put to work differently based on the specifics of each enterprise's requirements. On-premises, or private, cloud computing (IaaS and PaaS) is important for enterprises looking to maintain privacy while still getting the same self-service semantics as the public cloud. Public, or hosted, cloud computing gives central IT managers the flexibility to broker various services to their business users, saving dramatically on costs and time.
In mobile, as always, history repeats itself. Just as the mainframe computing world belittled the PC, so the PC world belittled the mobile handheld device. Think back to 2007, when the iPhone was first introduced. The cool mobile phone was the small, sleek Motorola Razr – and its champions poked fun at the iPhone and called it a brick. And yet, what happened at the end of this year's Super Bowl? The TV cameras charged onto the field as the Lombardi Trophy was about to be awarded. What did viewers see? A sea of iPhones in the hands of NY Giants players as they rushed to capture the winning moment. Yes, the iPhone and the other smartphones it ushered in have won. They are first-class computing citizens with capabilities that laptops lack, including location-based services and truly continuous connectivity. Mobile devices as enterprise IT endpoints are no longer the exception but the rule.
Users are enamored with their smartphones, with always-on connectivity, easy access, and visually exciting applications that find their favorite restaurant or keep them connected to family and friends. These very same users want these attributes in their business applications too. They want to point (or touch) and shoot, and within seconds have their application up and running. They don't want to have to enter a URL in a browser – it's the last resort now. So on the surface one may say, "So what? The phone is a microcomputer with a little OS and some APIs; just hire a developer to create little applications for that small screen." If only it were that simple. The phone is outside the corporate network completely, and the apps need to deal with network latency that would give an inside-the-firewall app timeouts left and right. The phone doesn't readily give an end user the opportunity to authenticate with Active Directory, and it needs its own functionality and development framework, which, it turns out, gave birth to mobile PaaS. The establishment never sees the disruption in its true glory.
What about getting apps onto the device and managing them? Is this an IT function? What traditional PC and app lifecycle management tools are built for this? Ah, the web app. IT moved away from heavyweight apps a while ago, and now we're back to doing that again for mobile. So the net of it is that all of these challenges equal opportunity, because desire is high at the point of attack.
The next big shift
With each new trend, disruption followed and brought along opportunity: opportunity for the next brilliant mind to create a technology that solved the obstacle. Each trend was accompanied by the creation of new companies and technologies. What disruption will arise from the mobile and cloud trend?
I’m betting on technology born in and enabling the success of big consumer facing properties like Facebook, Zynga and Google as the next shapers of enterprise IT. Examples include NoSQL, Hadoop and social mechanics. Unstructured data is growing faster than structured data and is being mined for business intelligence and productivity. Organizations are attempting to leverage social networks, fascinated by the interaction people have with each other and their ability to rally a protest or crowdsource the facts of a news story. It’s only a matter of time before our yearbook photos and status updates are part of the company directory.
We’re at the start of a wave and new companies are being created on a daily basis to lead the way to innovation. Each previous wave resulted in the creation of great enterprise software companies and we are sure to see this continue. Disruption creates opportunity and the desire to grab the opportunity yields outstanding innovation. At Ignition Partners, we are investing in this latest wave now and will continue to invest in technologies that are addressing the needs of future waves. As operators during the previous waves, and now as investors, we’re excited to be part of it.
First, Techstars Seattle Demo Day. What a super event, with lots of coverage. Great young companies, enthusiasm, great pitches, good progress in fundraising. Big audience with great energy. Super job by @andysack and everyone involved, a model for everyone else in the Seattle community who wants to nurture startups. We need more of these events, not just in cloud/web. I've seen a lot of entrepreneurship events at UW, and they are constrained by mentoring, hiring, and seed financing — exactly what the Techstars guys are providing. One of the companies, Romotive, has also done a great job leveraging Kickstarter and has generated a lot of early revenue — the rise of crowd-sourced pre-sales/funding is a fascinating and positive evolution.
Everyone was hiring at the event. As an indicator of how desperate people are to hire, I had two guys try to hire me. If you think I am the answer to your problem, you are pretty desperate.
Then I spent the better part of a day in a meeting with the UW College of Engineering Visiting Committee. Some great data on the College of Engineering — most programs are massively oversubscribed, turning away students in bunches, and doing a great job placing students. Great evolution in programs, great facilities, great staffing. The College could probably push out many more engineers but is constrained by state economic policies; with tweaks to tuition and governance, it seems like the pipeline could open much more broadly. We also had a chance to listen to President Young speak; he seems to have a very open attitude about IP licensing and to recognize that getting IP out of the university and to work is important.
I left the two days feeling like a lot of piece parts are coming together fast. Seed funding. Crowd sourcing. Mentoring. Training/Education. And with iteration and tweaking, we could see an explosion of economic growth in the Seattle area. Exciting times.
The second annual Hadoop World takes place this week, November 8 and 9, and it got me thinking about a few things I'd like to share. I attended Hadoop World last year, when there were around 1,000 attendees; this year's event is sold out.
It seems that an overarching theme for conferences this year is the move from definition to implementation. For example, at VMworld last year I noticed the majority of the conference was dedicated to defining the cloud while this year’s conference in Las Vegas featured sessions that illustrated “how-to’s” and cloud use cases.
In my experience, this transition is always a key indicator that a technology is gaining velocity. Makes sense, right? Conferences begin to reflect what the majority of people are talking about. In 2009 there were hardly any cloud computing conferences, and now it seems as if they are sprouting up daily, some better than others obviously.
Based on all of the real customer interest and accelerating deployment of Hadoop, we decided this year to join the momentum as an investor. We are proud to now be part of the Cloudera investment team. We made this decision based on the company's desire to produce a true platform for data. Our partnership has deep expertise in platform building, including DOS, Windows, Internet Explorer, Windows NT/2K, and Xen at the API and hardware level. A platform by definition is something upon which other things stand. A data platform is what Cloudera is all about. Great applications will be built on that platform. The Cloudera team is going in the right direction and we're excited to help accelerate that. See the press release here: http://www.marketwatch.com/story/cloudera-nets-40-million-in-series-d-funding-round-led-by-ignition-partners-2011-11-07
At last year's conference I noticed that the sessions were mostly about educating the audience. This year's agenda has an overwhelming number of sessions on the specific types of problems Hadoop is solving and features new and innovative ways in which people are using the technology. Increasingly we are seeing organizations, including the startup Tresata, introduce new capabilities that would not have been possible without Hadoop. Its elastic processing model is a big part of that, and I'm sure there will be many discussions around it.
This year's event is sure not to disappoint, and I'm really looking forward to the keynote from Cloudera's CEO, Michael Olson, and to hearing about the future of the project. And again, we're excited to be part of it!
So Nest is the newest shiny toy for the tech industry and media to get all excited about, a ton of coverage this week — for a thermostat. Obviously some of the ardor will fade — how long can anyone stay excited about a thermostat? But I do think there is a theme here which has some enduring value.
I’m not really that excited about the UI and learning features of the Nest thermostat. I am able to navigate my smart thermostat today, and I just don’t need to futz with it very often. In our new house it took me a couple days to get things where I wanted them but I’ve moved on and haven’t had to look back.
But I am totally excited about the remote access for the Nest thermostat, the web interface. Our houses are the biggest asset we own, and the cloud presence of our house is either missing or spewed all around the web in random places. There are so many things I should be able to do:
- Remote utility management. Remote thermostat is a nice start. I want remote utility management in general — what’s the temp right now, what’s my water usage, change my temp, change my water heater temp, turn on/off my sprinklers, check my power usage, turn on/off appliances/circuits, check my usage and billing history, etc. I can get pieces of this but it is hard hard hard to get it all and to integrate it all into a single cloud interface.
- Remote security. Webcam monitoring, alarm monitoring, history of access to house, remote door lock management. Again you can get piece parts of this but cobbling together is a significant pain.
- Remote doorbell. When someone rings my doorbell, I want an instant notification on my smartphone, I want to see the video feed from my door, and I want to be able to talk thru the intercom. The person at the door should have no idea if I’m in my kitchen or on a business trip. This is part of the security topic but is more compelling than most of the security features.
- Bills. Utility bills, consumption history, how I compare to others, bill payment.
- Financial info. Mortgage status — balance, rate, is it time to refi. The estimated sale value of my house. Mortgage document storage. Tracking of improvements to the house — costs, documentation — so I can correctly calculate cost basis at sale time.
- Service. All the warranty terms and docs for all the appliances and other features of my house. A place to track service records, to record preferred vendors, to get vendor recommendations. A service advisor — what maintenance should I expect to do in the next year based on what is known about my house — time for roof inspection, approaching lifetime of water heaters, time to repaint, what is my likely cost in the next year for all this.
You can get a ton of this info today but it is spewed all over the web. To access all the info about your house, you would have to access the Nest site, any smart metering site, a remote security site (or several for webcam, door locks, monitoring service), each of the utility websites, your bill payment web site, your mortgage provider website, zillow, redfin, etc.
I’d love to have a portal that integrates all this via user configurable widgets into a single interface — my home at a glance. And gives me great mobile access to all the info and features. And just gets better as I add nicely designed devices into the house — a Nest thermostat, a great doorbell/webcam, internet-controllable door locks, etc.
I’m sure the Nest guys are thinking broadly about the entire space, with a general name like Nest they must have ambitions beyond thermostats. I’m excited to watch their evolution. I’d love to have better command of the largest asset I own.
Daring Fireball points to a pretty thorough takedown of QR codes as used in print ads. The original design goal — Toyota invented these to track parts — makes sense, but jamming these into consumer media is just strange.
- Users can already type in your URL or a sentence, or speak into Siri, or do an image search with their phone. Is taking a snap of this code thing really so much better?
- There's a history of companies trying to stuff proprietary ID systems in between users and product/service providers. These visual codes are one such thing; AOL Keywords and RealNames are text-based equivalents. They all try to get advertisers to stuff these into ads, but I don't see how this really serves users or advertisers; it mostly just serves the companies with the proprietary ID system.
- Ultimately, if your product/ad/message is so forgettable that you think jamming a QR code or text string in will help, well, there is a deeper problem.
About a year ago Simon Crosby (then Citrix CTO) and I talked to the folks at BlueStacks about the concept of virtualizing Android apps on Windows. At first mention of it, more than a few people look at you kind of funny. But five minutes into the conversation the light bulbs start going off about the possibilities. What hit me was the fact that the cool Android apps are real apps that, for the most part, take advantage of the local processing platform, including graphics acceleration and object storage. The counterparts to many of these apps on a Windows PC are usually web apps that are also accessible from any device. The web apps have great reach and in general enable fuller access to certain parts of applications. The Android app design center (and the mobile app design center generically) is much more focused on the heavily used portion of the app. This is largely due to screen real estate and touch versus keyboard-and-mouse input. Navigation to what you need or want to do is heavily streamlined, resulting in a simply cleaner day-to-day experience.
Ok, so why bother with this on a Windows laptop or tablet? The first thing that got me was an observation I made about two years ago about the activity of enterprise developers. I'm referring here to the millions of people who work for businesses of all sizes and develop in-house applications. The best endpoint developers at the largest companies were spending the bulk of their time getting their mobile app chops together. The tooling was kind of shaky for large-team development, but the best code jockeys were starting to write apps first for mobile deployment while keeping a web app hanging around for non-native platforms. What I found curious was the large number of developers targeting Android versus iOS. I expected a landslide in favor of iOS but it wasn't happening that way. The enterprise shops were doing one of two things:
· Do native Android and iOS apps, then web apps for everything else (I'm overloading .NET front ends as web apps here, and ASP.NET is very common)
· Do native Android apps and web apps for everything else
One can argue and speculate about the reasons iOS isn't winning in a landslide, but from personal experience (as an executive working at Citrix) I can tell you that Apple in general doesn't care about enterprise developers. They won't make their money there, so why bother, and enterprise developer support is expensive and certainly not sexy. Is that enough to make the developers swing to Google? Google might care about enterprises, since they want to sell an office suite to them, but again, in general it is not in their DNA through the marrow. If developers made that choice based on the vendor caring about them, they would be on WP7 or BlackBerry. The enterprise guys and gals like deployment platform diversity. The iPhone has lots of options, right? You can get the black one or the white one, plus a couple of other cosmetics here and there. Suppose you want a bigger screen? A brighter screen? A smaller screen? A foldout keyboard? Something with superior battery life? Waterproof? A better speakerphone? Some security widget? A cheaper device? Well, then you go elsewhere, and the elsewhere is largely an Android device. Finally, the enterprise guys say the browser on the iPhone and iPad is darn good. In fact, it's good enough to handle whatever they would write for the PC and Mac.
Alright, so the browser on Android has to get good enough at some point (hey, HTML5 will fix everything), so why write native apps? Well, native apps are cool, learning to leverage a platform is cool, and developers like to be the coolest developers. It happened with the PC. I was one of the hotshots sent to code PC assembler while my co-workers slaved away on mainframe and minicomputer COBOL. We were the cool guys who stayed up all night and cranked out thousands of lines of code a day. What we did was harder. We got the pay raises. We spoke a different language. We moved to C and C++ and built the first PC client-server apps right on OS metal with nothing but a network transport to help us out. How does this translate today? The best developers will want to get the most out of the platform and will go as native as necessary to do so. History has shown this and will repeat itself. Reach is equally important, but it lacks the emotion and passion of watching your code make the platform dance.
OK, back to the title entry: Android on Windows. As a developer, if I can spend all my time working on the thing I have passion for and then use another technology to get more reach, then I'm all over that. I'm especially all over it if I don't need to sacrifice the experience I'm targeting as use case #1 in order to get reach. That is, layered frameworks that enable multi-platform deployment can be OK, but they separate the hotshot developer from the platform. Here lies the allure of BlueStacks for me. That hotshot can now take her Android code and have it run on a PC laptop or tablet with no changes, delivering the same streamlined experience she built for the mobile platform. Sure, she will have to maintain a web app for everything else, but now all the PC users can get her latest and greatest whenever she moves the mobile application ahead.
The allure (for me) was for the enterprise developer. However, upon release of the pre-alpha it seems like many people just want it for all their Android apps, so I was wrong, but in a good way! All developers, then! So yeah, "really" squared. I'm using the Android LinkedIn app on my laptop. I like the single pane without all of the extras I would use to "manage" my LinkedIn. It is like a little news feed with laser-sharp access to the important stuff. BlueStacks went live in pre-alpha last week. In that week over <a nice number with lots of zeros> users downloaded the Android app player and are kicking the tires and then some. I like being wrong like that! It gets even more interesting with the Windows 8 Metro user interface, where the Android apps will just take their assigned places on the canvas of the display with all the other apps (I have seen this working since I'm an investor). Now there is a puck for BlueStacks to skate towards.
We invested in Blue Stacks in March 2011 and are excited to see the software getting out to end users in large numbers. New funding was announced this week including Citrix, AMD, and a player to be named later. I especially welcome the new investors as they will help the company in driving the agenda forward not just via their investment in dollars but in real business initiative.
The company is http://www.bluestacks.com
Follow me @frankartale
So I don't really get iCloud storage yet, and Photostream doesn't really accommodate all my DSLR pictures well. So rather than just whine about what I don't have, what do I really want?
First — I have a 203G (gigabyte) Aperture library today, that is where my primary photo storage is. Digging into this a little:
- 54G is thumbnails, previews, cache of various sorts. 27G of thumbnails alone! Impressive use of disk space, Aperture. Clearly the team has embraced the idea that disk space is cheap and is getting cheaper. There are probably some settings I could tweak to trim the size of all this at the cost of performance, but whatever, disk space IS cheap, 30% overhead is probably not a ridiculous design objective. This is all derived data tho and could be trimmed, dropped, whatever, as I think about cloud storage.
- My masters are 149G. A mix of RAW and JPG, depending on which camera/scanner I used and how long ago I took them, tending towards more RAW over time.
- 19G from this year
- 34G from 2010
- 25G from 2009
- 71G from earlier years.
Let's assume I continue to take pictures at the last-3-year average rate for some time; that is about 25 gig of new photos every year, not accounting for inflation in photo size due to better-quality capture chips, light field cameras, etc. So you probably have to assume some growth in that 25 gig of new storage a year.
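A minimal sketch of that projection, using the per-year master sizes above. The 10% annual inflation in photo volume is my assumption, not a measured number:

```python
# GB of new masters per year, from the library breakdown above.
masters_by_year = {2009: 25, 2010: 34, 2011: 19}
base_rate = sum(masters_by_year.values()) / len(masters_by_year)  # ~26 GB/yr

def projected_library(current_gb, years, annual_growth=0.10):
    """Estimate total master storage after `years`, letting the yearly
    intake inflate by `annual_growth` (better sensors, more RAW)."""
    total, rate = current_gb, base_rate
    for _ in range(years):
        total += rate
        rate *= 1 + annual_growth
    return total

# Starting from the 149 GB of masters, three years out:
print(round(projected_library(149, 3)))  # ~235 GB of masters alone
```

Add the ~30% of derived data (thumbnails, previews, cache) on top of that and the 300-400 gig planning range below looks about right.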
Cloud storage of photos — is it important? Hugely so, if my house is burning down, I do not want to be running back in to save a hard disk, photos are emotionally very important. And I do NOT want to have to pick and choose which photos I store in the cloud — too many photos, not enough time, I just want the entire set up in the cloud. I really just want my entire Aperture (and iPhoto) collection replicated to the cloud automagically. And then I need some modest access control features on the folders in the cloud so that I can share selected photo sets with family members, etc.
So I want a cloud storage solution that gives me ~200gig of storage today at a reasonable price, and if I think about the next couple years, a clear path to 300-400gig. And with good web access with some security. What are my choices today?
- iCloud doesn't begin to work. Aperture doesn't really talk to it except through Photostream. The max storage I can buy is 55 gig. There are no access controls. It fails along almost every dimension.
- Dropbox. I can get 100G for $240 a year with a nice web interface and some sharing controls. I could even get the team license, store up to 350G, but for $795 a year. If I had this, I could just move my Aperture library into my dropbox folder and voila, it would be in the cloud, on my other machines, etc. However — the Aperture library folder is not really meant to be browsed by humans, the masters are chopped up into some funky balanced tree of directories. Seems like Aperture needs to learn how to work with shared storage. But I could get everything in dropbox, with a very easy UI for me, but at a high price, and probably the ability to share folders with family members would be hard to realize.
- Box.net. Well I get 50G free with their iPad offer, so they pretty much trump iCloud. I could get up to 500G in a business plan for $180/year per user. Similar pros and cons as with Dropbox, but pricing seems better.
- Smugmug. This is what I use today. There is an Aperture plugin, I can save from Aperture. The bad part about this is that it is not automagic — I have to intentionally move folders up there, not happy about that. But — unlimited storage, at $40-150 per year for jpg, some extra cost but still cheap if you want RAW. A great interface for sharing, completely customizable, printing integration, etc.
For now, Smugmug is the way to go, but as storage costs drop I can see flipping to Box.net or Dropbox at some point. I'd give up some of Smugmug's great interface for admin control, but that is overkill for me anyway. If Apple made this all work natively in Aperture at a competitive cost, that would be fine too. For people with a more modest set of photos, the Box.net 50G free offer for iPad/iPhone users seems like an awesome option.
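The per-gigabyte math behind that choice, using the prices quoted above. Modeling Smugmug's unlimited plan at my ~200 gig library size (and at its top $150/yr tier) is my assumption:

```python
# Annual cost per GB for the plans discussed above: (dollars/year, GB).
plans = {
    "Dropbox 100GB":    (240, 100),
    "Dropbox Team":     (795, 350),
    "Box.net 500GB":    (180, 500),
    "Smugmug (~200GB)": (150, 200),  # unlimited, modeled at library size
}

for name, (dollars_per_year, gb) in plans.items():
    print(f"{name}: ${dollars_per_year / gb:.2f}/GB/yr")
```

Box.net comes out cheapest per gig on paper, which is why it looks like the eventual landing spot once the workflow issues are solved.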
I’m struggling to understand why I would ever use iCloud storage. After a couple days of tinkering, I have two sets of data in iCloud — device backups, and Pages/Keynote docs.
- I really don't get the value of device backups. My apps are all recoverable from the iTunes store. I primarily use apps like Evernote that already store their data in the cloud, so there is minimal non-replicated data on my iPhone and iPad. Music isn't backed up; I will need iTunes Match for that some day. My photos aren't backed up in iCloud, that is not something that is offered at all (and besides, the photos on my device are a fraction of my photo content; I use smugmug and other paid services to back up all my photos). So what exactly is in these device backups that iCloud stores? And why is it substantially better than backups stored on my Mac? When will I ever use these backups? In sum, I've been explicit about choosing and configuring apps so that all my valuable data and state info is replicated and in the cloud, so that I don't care if I lose a device (and can use multiple devices). So why should I care about device backups?
- The other files in my iCloud storage are docs. I have Pages and Keynote docs in iCloud from my iPad. If I was purely a Mac person, and didn’t collaborate at all with people in my office and business partners who use Office, then maybe I could just use Pages/Keynote on the Mac, and the iCloud doc storage might seem pretty simple. But I use a PC sometimes to edit my docs. And so I use Office so that I can work on my Mac or PC. And so that I can, with no fidelity loss, work with my colleagues on docs they have created in Office. I guess I could still move these docs in and out of iCloud storage, but if I am going to go to the trouble of moving docs around, why don’t I just move them into box.net or dropbox? They both have great iPad and iPhone interfaces, they work with Pages/Keynote on the iPad, I get 50G free on box.net, they both offer sharing options, I can create folders in them to organize my docs and control my sharing (Seriously, iCloud, no folders??), they let me store any kind of doc, they have great Mac/PC clients so that I can sync my collection with local folders easily, etc etc. If iCloud didn’t have the Apple brand, we would all be laughing at it.
- iCloud claims to store your music but practically doesn't. I have 16,000 songs, 88G of music, in my iTunes library (and flac versions of all this, but not in iTunes). 99% of it is from ripped CDs or purchased in mp3 format outside of iTunes. None of which iCloud handles; I have to wait for iTunes Match.
- iCloud stores your photostream but I’ve already talked about why this isn’t very useful to me.
- I don’t care about mail/calendar/contact backup as all mine is already stored on my Exchange server or Gmail server.
So iCloud storage is substantially worse than the leading competitive alternatives for document storage; its only unique benefit is device backup, which I can't figure out why I'd use; and its other features don't really solve any problems for me. I am sure Apple will improve iCloud over time, but as a storage solution it is underwhelming. Am I missing something? Does anyone find iCloud storage to be hugely helpful?
When I sit in Ohio Stadium for a football game, my fancy smartphone is a useless piece of metal and plastic. Some developers have tried to come up with apps to improve the gameday experience, but these apps miss the point. With 105,000 fans in the stadium, another huge set of ticketless fans milling around outside, all the stadium staff as well as security and service staff outside the stadium — there are probably 200,000 network devices in 30-40 acres all trying to jam onto the system, and all failing. The cell network simply can’t handle the load.
Our cell networks are wonderful things, but in the build out of our networks, the notion of broadcast has been left behind. 98% of the fans want the same exact data — top 25 scores, breaking football news, in-game replays, radio game feed. And yet the cell network and data apps feed this data to each user via dedicated single-user transactions. Cell broadcast exists in the standards but is not really in use in networks or handsets. Qualcomm tried to push Mediaflo for this use but got very little uptake and eventually shut down the service.
It's unfortunate that the idea of broadcast has been left behind. It would be hugely useful in these kinds of crowded venues. I wonder if Qualcomm might have succeeded had they just focused on NFL and NCAA football fans: people who spend stupid amounts of money on tickets and related gameday expenses, and who would probably spend money on dedicated gameday data services. It is not an easy service to provide tho. It requires spectrum, devices using that spectrum, and local content assemblage and editorial. There may be too many moving parts. It might be easier just to truck in lots of picocells to events and say screw it, dynamically expanding the cell network as needed.
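A back-of-envelope sketch of why unicast collapses in the stadium. The 500 kbps per-user feed is an illustrative assumption; the 200,000-device figure is from the estimate above:

```python
# Aggregate bandwidth needed to deliver one shared gameday feed
# (scores, replays, radio) unicast vs. broadcast.
devices = 200_000      # devices in and around the stadium (estimate above)
stream_kbps = 500      # assumed bitrate of the shared feed per user

unicast_gbps = devices * stream_kbps / 1e6   # every device gets its own copy
broadcast_gbps = stream_kbps / 1e6           # one copy serves everyone

print(f"unicast:   {unicast_gbps:.0f} Gbps")    # 100 Gbps
print(f"broadcast: {broadcast_gbps:.4f} Gbps")  # 0.0005 Gbps
```

Even with aggressive sectorization and trucked-in picocells, a five-orders-of-magnitude gap is why a single broadcast channel for the 98% of identical traffic is so appealing.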