
I started my career in the information technology industry when I was young, so young that I was still in high school. I wasn't working at Burger King or the mall like most of my friends; instead, I was putting my passion for technology to work bringing the people of the Cincinnati area dial-up internet access. A few friends and I had connected with a local businessman and, somehow, the idea to start a new company was formed. In the beginning, it was a handful of us, with me and my friend Todd doing all the server and development work. We spent countless hours of the mid-90s building servers, creating web pages, and working hard in our tiny server room. It felt like the pinnacle of technology to me. The whole world was at our fingertips, just waiting for us to capture it. Roles shifted, and Todd moved on to bigger and better-paying things. Soon after he… Read More →


As a long-time enterprise infrastructure specialist, I've spent countless hours trying to optimize the performance of the environments I support. Early in my career, I spent some time on a team that worked very closely with the monitoring team, where I learned how hard it is to correlate the volumes of data collected. We were collecting so much data about our environment that it was almost overwhelming: things like CPU temperature, how many storage I/Os were pending, and how much memory was in use. We had all this awesome data, and what did we do with it? We set up monitoring to make sure the numbers didn't cross a certain threshold. When a number did cross its threshold, we sent an alert. All this data at our fingertips, and all we used it for was alerting. I knew something was off, but I was green and didn't understand that we were missing the bigger picture. That was a long time ago… Read More →
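For context, here is a minimal sketch of the kind of threshold-and-alert check I'm describing. The metric names and limits are hypothetical, and any real monitoring stack does far more than this.

```python
# Illustrative sketch only: threshold-based alerting over collected metrics.
# Metric names and limits below are made up for the example.

METRIC_THRESHOLDS = {
    "cpu_temp_celsius": 80,      # hypothetical limit
    "pending_storage_ios": 500,  # hypothetical limit
    "memory_used_percent": 90,   # hypothetical limit
}

def check_metrics(samples: dict[str, float]) -> list[str]:
    """Return an alert message for every metric that crossed its threshold."""
    alerts = []
    for metric, limit in METRIC_THRESHOLDS.items():
        value = samples.get(metric)
        if value is not None and value > limit:
            alerts.append(f"ALERT: {metric} = {value} exceeds threshold {limit}")
    return alerts

if __name__ == "__main__":
    # One polling cycle with sample readings
    readings = {"cpu_temp_celsius": 85, "pending_storage_ios": 120, "memory_used_percent": 91}
    for alert in check_metrics(readings):
        print(alert)
```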


In my twenty years of enterprise infrastructure experience, I've noticed a few things that are universal to every organization. One of the most universally time-consuming parts of working in IT is disaster recovery testing. We all know that business continuity is extremely important, but that doesn't make testing and executing recovery plans any less expensive. It takes compute power to take full and incremental copies of the data and, of course, storage to house the backups. Organizations also spend weeks and weeks of people's time planning, documenting, executing, and remediating disaster recovery plans. Until it's needed, business resiliency often seems like a waste of money and time, but that all changes the moment you need it. When it is finally needed, everyone remembers what a great investment data protection is, but what about all the rest of the time? Can't data resilience be more than a one-trick pony? The simple answer is yes, it is possible to use all the data copies… Read More →
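As a rough illustration of why even incremental copies cost compute and storage, here is a toy sketch of a full versus an incremental file copy. The paths and timestamp handling are assumptions for the example; real backup software tracks changes at the block level and keeps proper catalogs.

```python
# Minimal sketch of full vs. incremental copies using file modification times.
# Paths are hypothetical; this is not how enterprise backup products work internally.

import shutil
import time
from pathlib import Path

def full_backup(source: Path, target: Path) -> float:
    """Copy everything, then return the time the backup completed."""
    shutil.copytree(source, target, dirs_exist_ok=True)
    return time.time()

def incremental_backup(source: Path, target: Path, last_backup_time: float) -> float:
    """Copy only files modified since the last backup completed."""
    for path in source.rglob("*"):
        if path.is_file() and path.stat().st_mtime > last_backup_time:
            destination = target / path.relative_to(source)
            destination.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(path, destination)
    return time.time()

if __name__ == "__main__":
    last_run = full_backup(Path("/data/app"), Path("/backups/app-full"))
    # Later runs copy only changed files, which is why incrementals are cheaper.
    incremental_backup(Path("/data/app"), Path("/backups/app-incr-01"), last_run)
```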


In the IT industry today it is nearly impossible not to hear the word cloud dozens of times a day, but many storage administrators treat cloud as a four-letter word. The basic tenet for a storage administrator is to ensure an organization's data is safe and secure. If the storage administrator makes a mistake, bad, bad things happen: companies fold, black holes collapse, and the sun explodes. NetApp is trying to change the minds of those storage administrators, and for good reason. IT organizations are always looking to do more work with less money, and cloud storage can't be ignored as a viable way to do that. At Storage Field Day 9, NetApp talked a fair amount about how they are embracing cloud storage as key to the industry's future. A storage vendor can no longer afford to ignore the cloud, and NetApp sees it as central to its strategy. Part of the future for NetApp is expanding the… Read More →


This week I've been spending some time at Pure Accelerate, where I've been able to talk to the engineering and executive teams behind the new FlashBlade system. In an attempt to embrace its startup cultural roots, Pure Storage developed FlashBlade as a startup inside the company. What that means is they hired new engineering staff to build a unique and separate product from the ground up. To keep the development secret, the new team members were not even connected to other Pure employees on LinkedIn. While the development was largely separate, some of the FlashArray development team did help where it made sense. That collaboration resulted in a fork of the FlashArray management interface, which is used by FlashBlade. The result of this startup-within-a-company is a new and unique product. The first thing to understand about FlashBlade is what it is not. It is not a replacement for a low-latency and… Read More →

If you've worked in IT for any amount of time, you've likely heard the term "secondary storage," which you've known as a backup tier. You've also heard of "tier 2" storage for test and development workloads that don't need the data services of production. These two tiers have had very different requirements: backup target storage is generally cheap, deep, and optimized for sequential writes, while test/dev storage needs real performance since it hosts actual workloads. Cohesity thinks this needs to change. They contend that secondary storage should be anything that is not primary storage. Redefining a term and carving out a new market segment is no small task, but Cohesity shows some pretty interesting use cases, such as data protection for VMware environments: once a hypervisor snapshot is created, the data is sent to the Cohesity array, where things like deduplication and replication can be applied. This gives you unlimited snaps without the… Read More →
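To show why deduplication makes keeping many snapshot copies cheap, here is a toy content-addressed store. The fixed chunk size and in-memory dictionary are assumptions for illustration only, not how Cohesity actually implements its platform.

```python
# Illustrative only: snapshots split into chunks, with duplicate chunks stored once.
# Real arrays dedupe variable-length chunks and persist their metadata.

import hashlib

CHUNK_SIZE = 4096                     # hypothetical fixed chunk size
chunk_store: dict[str, bytes] = {}    # hash -> unique chunk data

def ingest_snapshot(data: bytes) -> list[str]:
    """Store only chunks not seen before; return the hash list that rebuilds the snapshot."""
    recipe = []
    for offset in range(0, len(data), CHUNK_SIZE):
        chunk = data[offset:offset + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in chunk_store:  # new data costs space...
            chunk_store[digest] = chunk
        recipe.append(digest)          # ...duplicate data only costs a reference
    return recipe

if __name__ == "__main__":
    snapshot_1 = b"A" * 8192 + b"B" * 4096
    snapshot_2 = b"A" * 8192 + b"C" * 4096   # mostly identical to snapshot_1
    ingest_snapshot(snapshot_1)
    ingest_snapshot(snapshot_2)
    print(f"Unique chunks stored: {len(chunk_store)}")   # 3, not 6
```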