Day one of my journey to Tech Field Day #9 is being spent at the very picturesque, if a bit hot, Dell campus in Round Rock, Texas. It’s always a pleasure to spend the day at a single site, as it reduces the dead time spent traveling from one location to another (although it can be a pleasure to see a variety of places in a day). The gauntlet of topics queued up focused on a variety of technical disciplines: data protection, monitoring, active solutions, and some geeky fun with the folks in the Dell Tech Center. You can view the recorded videos at the Dell landing page on the TFD site.
Note: All travel and incidentals were paid for by Gestalt IT to attend Tech Field Day 9. No other compensation was given.
The introduction for the day was handled by Peter Tsai with a lowdown on exactly what the Dell Tech Center folks do. The gist is that they provide a technical haven for those interested in being part of a larger conversation around various Dell products and solutions, without going down to the level of tech support. The team boasts well over 5,000 pieces of content, ranging from knowledge articles and videos to blogs and news bits. After a brief quiz to see if we had been paying attention (and it’s always hard to break the ice for the first session), a pair of TFD9 delegates were rewarded with some Dell mug swag.
That’s a great start for anyone’s day, eh?
One of the founders of AppAssure, Dr. Srinidhi Varadarajan, was up next to speak on data protection at Dell. As VP and GM of the Data Protection Group, he gave us quite the update on the data protection products from both a capability and roadmap perspective. Previously, the company had no backup IP in house and relied on partnerships with other vendors. Now, however, the IP from various acquisitions is being integrated. These include companies like Vizioncore, SonicWall, BakBone, AppAssure, and Ocarina Networks. Based on the presentation, it appears the ultimate goal of this group is to create a unified solution in Dell Data Protect.
With a development cycle estimated at 18-20 months to blend the IP together, the end state vision is to form a singular solution with integrated support for replication, restores, backups, dedupe, compression, and all the other goodies that need to be baked into a protection solution. The Dell products, however, will focus on three different zones or sweet spots: the SMB, Enterprise, and Desktop/Laptop endpoints. Srinidhi clearly stated that all of the various products will talk to one another to avoid a customer rip-and-replace migration nightmare as they climb up the solution stack with potential future growth.
Why Split Into SMB and Enterprise Products?
Let’s begin with a bit of history. Typically the SMB product doesn’t have all the features of the Enterprise product, and vice versa. The generic reason given is that the new kid on the block in the protection game has to shake things up and innovate with features the incumbent doesn’t have, which often means targeting smaller organizations. Later, when the little guy is gobbled up by the big guy, you end up with feature disparity. Dell aims to change this by ensuring that the Enterprise product is in fact a superset of all the features found in the SMB product. The goal is to avoid needing to deploy both products in a larger environment.
As for the split, the idea is similar to offering cars of various sizes. If you’re doing long haul trips, the Nissan Leaf just isn’t going to cut it. Alternatively, a short distance city commuter isn’t going to want a Cadillac Escalade. In today’s world, the interface to each will most likely be different, but it seems the ultimate direction is to create some similarities between the management interfaces to avoid user-experience pain points as skill sets float upwards from the SMB product to the Enterprise product, or for channel partners who need to float between the two.
The four tenets defined by Dell Data Protection are: Backup in minutes, restore in seconds; Flexible, application aware solutions for physical and virtual; Backup and business continuity in one solution; Optimize storage capacity and utilization.
We were shown a ton of products under the Dell solutions banner, which reveals a wide variety of choice but also a large amount of fragmentation. I would imagine this is a challenge for Dell and their associated partners to communicate down the channel, and for customers to get a firm grasp on what solution best meets their set of requirements and constraints. This seems like a solvable problem by means of consolidation, but I’m not so sure that’s the goal. In my perfect world, there would be a few core products that simply use a modular approach to become what is needed. Much like Voltron.
The final interesting tidbit revolved around anything-to-cloud protection. Dell Data Protection is looking at doing physical-to-cloud protection with its self-titled non-standard formats on the cloud platform. The idea is that you are replicating between dissimilar hypervisors while still being able to provide assurance that the workload has been replicated and ultimately protected by the cloud target. Cloud is obviously going to be a good chunk of the plan behind Dell’s DR strategy, as there is a respectable customer appetite to consume cloud in such a turnkey, on-demand manner.
However, time will tell how well the Dell solutions can ultimately encapsulate and protect workloads in such a workflow, especially when it comes to the orchestration and (perhaps most importantly) fail-back functionality. After all, it’s relatively easy to go someplace, but typically much, much more difficult to come back from that place. This will be an especially critical area to prove out to ensure customers are comfortable and confident in the solution being purchased, especially assuming that the price point is palatable enough to end up on the IT budget (DR is usually the first thing cut).
The next session was delivered by Mattias Sundling, Product Manager, and Thomas Bryant, Principal Architect, with a topic of monitoring an environment with Foglight.
First off, huge props to Mattias on his educational and funny slides. This is probably one of the rare times you’ll ever hear me compliment PowerPoint slides (I usually hate them) but he did an amazing job. It takes a lot of time and effort to come up with something humorous and informational, and it is appreciated. Note that the Executive seems to have a beverage of some sort. Hmm.
There were a few distinct advantages I could see with using Foglight.
First, it handles both physical and virtual servers of all varieties. The targets for monitoring could be on VMware, Hyper-V, a wide variety of bare metal Windows and Linux OSes, Active Directory, Exchange, network equipment, and so on. As long as it can be polled via SNMP, SMI-S, WMI, etc. it can be monitored in Foglight. That sounds pretty sharp.
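That "anything pollable is monitorable" claim boils down to a plugin-style collector. Here's a minimal, entirely hypothetical sketch of the pattern (none of these class or method names come from Foglight itself; real pollers would speak SNMP or WMI on the wire, which these stubs fake with canned data):

```python
from abc import ABC, abstractmethod


class Poller(ABC):
    """One poller per protocol (SNMP, WMI, SMI-S, ...)."""

    @abstractmethod
    def poll(self, target: str) -> dict:
        ...


class SnmpPoller(Poller):
    """Stub standing in for a real SNMP GET against a device."""

    def poll(self, target: str) -> dict:
        return {"target": target, "protocol": "snmp", "uptime_s": 86400}


class WmiPoller(Poller):
    """Stub standing in for a real WMI query against a Windows host."""

    def poll(self, target: str) -> dict:
        return {"target": target, "protocol": "wmi", "cpu_pct": 12.5}


class Collector:
    """Routes each monitored target to the registered protocol poller."""

    def __init__(self):
        self._pollers: dict[str, Poller] = {}

    def register(self, protocol: str, poller: Poller) -> None:
        self._pollers[protocol] = poller

    def poll(self, protocol: str, target: str) -> dict:
        return self._pollers[protocol].poll(target)


collector = Collector()
collector.register("snmp", SnmpPoller())
collector.register("wmi", WmiPoller())
print(collector.poll("snmp", "switch01"))
```

The appeal of the design is that adding support for a new device class is just another `register()` call, which is presumably how a single product ends up covering hypervisors, bare metal, and network gear alike.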
Second, the Foglight Management Server can be installed on a Windows or Linux server and houses the analytics and rules engine for the solution. It is ultimately the engine that provides the front end user interface and does the back end polling. The team also claims that it is incredibly easy to install, with a boast of taking longer to download the installer file than to get it up and running. Bold claim, gentlemen! I’ll have to test that some time (unless someone cares to comment). Skipping ahead a little bit, I was excited to learn that the solution includes some semi-robust chargeback tools and requires no additional installation to get that working. I’ve installed vCenter Chargeback and can tell you from personal experience that it is not a fun experience.
End-to-End Root Cause Joy
Expanding further on Foglight, we drilled down into some really impressive levels of visibility. By tying the demo solution in his lab to vCenter, the SAN, and a storage array, Thomas was able to show us down through the virtualization layer, into the actual storage fabric (such as an MDS 9148 as shown in the demo) and the storage array targets and LUNs. The entire path was shown with informational points and statistics at each hop along the way. I would imagine this would make finding the root cause of an issue rather straightforward.
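To see why per-hop stats matter for root cause, picture the VM-to-LUN path as an ordered list of hops, each carrying its own measurements. This is an illustrative sketch of my own (the hop names and latency figures are made up, and this is not Foglight's actual data model), but it shows how the outlier jumps out once every hop reports in:

```python
# Each hop on the end-to-end path (VM -> host -> fabric -> array -> LUN)
# carries its own latency sample in milliseconds.
path = [
    {"hop": "vm-web01",     "latency_ms": 0.4},
    {"hop": "esxi-host-03", "latency_ms": 0.6},
    {"hop": "mds-9148",     "latency_ms": 0.5},
    {"hop": "array-ctrl-a", "latency_ms": 0.7},
    {"hop": "lun-42",       "latency_ms": 18.2},  # the outlier
]


def slowest_hop(path):
    """Return the hop contributing the most latency -- a first root-cause lead."""
    return max(path, key=lambda h: h["latency_ms"])


print(slowest_hop(path)["hop"])  # lun-42
```

Without the full path in one view, that same investigation usually means three different teams staring at three different dashboards.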
After the session was completed, we were escorted over to the Executive Briefing Center. It was time to get a live demo of AppAssure courtesy of Gina Minks and crew to show off some of the bells and whistles and have an offline discussion about the product. Since it was all done offline, I’ll keep my intel on the demo brief with a photo journey!
I will say that the demo was impressive in and of itself, but we had a few bones to pick about how the product managed backups and the naming of various components.
And here’s a sneak peek of the demo, in which the presenter deleted both the log and DB drives from an Exchange server and immediately restored them using AppAssure in a clever way. I was impressed, and so was the room of delegates.
Dell Tech Center Visit
We also took a sneak peek into the world of the Dell Tech Center team and found them all hovering over a new Dell VRTX server. While we didn’t really go into any noteworthy details on the unit, it was fun to see one first hand and take it apart (just a little bit, anyway).
I really enjoyed the day at the Dell campus and learned a lot about a company that has mostly eluded my radar. They have laid out some rather lofty plans for integration and roadmap, and I think that they have the staff on hand to make it a reality. The real question is how well they execute on this vision once the product is ready to go to market, including working their channel, educating customers, and continuing to be competitive in a rapidly changing tech market. After all, acquiring a large handful of companies does introduce an equally large stagnation period as one tries to cobble the various IP bits together, and this requires resources that could have been dedicated elsewhere. Time will ultimately tell, but I would imagine that it won’t take very long to see the fruits (or lack thereof) of this effort.
In my opinion, they are on the right track with a lot of the core messaging around the day’s events. The biggest challenges will be truly streamlining the solutions into a smaller number of offerings and defragmenting the choices – at least from a “31 flavors” perspective. Too much choice can sometimes inspire nothing more than indecision.