The Enterprise

In mid-July SanDisk announced its acquisition of Fusion-io, and the deal was completed a couple of weeks prior to Flash Memory Summit. I posted my initial thoughts when the news hit the public, but I feel it's worth doing a deeper analysis now that I have given it some more thought and discussed it with John Scaramuzzo, senior vice president and general manager of SanDisk's enterprise business.

SanDisk has managed to establish itself as one of the key players in the enterprise SSD space over the past few years. The acquisitions of Pliant in 2011 and SMART Storage Systems in 2013 provided SanDisk with strong expertise and product lineups for SATA and SAS SSDs but left the company without a solid long-term plan for PCIe. I heard Pliant's initial roadmap included plans for PCIe-based solutions as well, but it looks like those plans never materialized.

Up until the Fusion-io acquisition, the Lightning PCIe SSA was the only PCIe solution in SanDisk's enterprise product portfolio, and as a matter of fact that drive is internally a SAS-based design with a PCIe to SAS bridge onboard. In other words, SanDisk had practically zero native PCIe solutions for the enterprise, while SanDisk's biggest competitors, such as Intel and Samsung, had been shipping PCIe drives for quite some time.

Fusion-io's 3.2TB Atomic Series SSD

Fusion-io's strategy and product portfolio, on the other hand, was the complete opposite. From the beginning Fusion-io focused on PCIe storage, dating all the way back to 2007 when the company released its first ioDrive, which utilized a PCIe x4 interface and was capable of speeds up to 800MB/s. Not only was Fusion-io early to the market, but the company was also able to garner a few massive and very important clients – the most notable being Facebook and Apple. I don't think it's an overstatement to call Fusion-io the pioneer of PCIe storage, because it was the first company to turn PCIe SSDs, and PCIe storage in general, into a large, successful business.

But stories eventually come to an end. Fusion-io's competitive advantages were its PCIe technology and several high-profile customers, but those advantages were lost when the NAND manufacturers stepped into PCIe territory. It's nearly impossible for a company that has to source its NAND from a third party to compete against one that manufactures NAND in-house, since the latter will always have a cost advantage. While Fusion-io didn't lose its customers to competitors overnight, it's clear that Intel and Samsung in particular have snagged a share of Fusion-io's business over the past couple of years.

In a nutshell, the acquisition brings SanDisk the long-needed expertise in PCIe storage along with Fusion-io's broad PCIe product portfolio. The acquisition is now a bit over 100 days in, and the Fusion-io employees have been integrated into SanDisk's existing teams. Initially Fusion-io's engineering team was separate and worked under Lance Smith, the former President and COO of Fusion-io, but Mr. Smith decided to leave SanDisk and pursue other options. Last week, data virtualization startup Primary Data announced that Mr. Smith has joined the company as its new CEO, which explains his quick departure from SanDisk.

All the engineering talent has now been unified and the team is led by Mr. Scaramuzzo. With everyone under the same roof, the roadmaps are now in the process of being integrated to bring the expertise together. It will be a while before we see the fruits of the acquisition, but in the meantime the latest Fusion-io products will transition to SanDisk NAND for increased cost efficiency.

But what about NVMe? That has been the hot topic in the industry this year, and I bet many of you are wondering what SanDisk's and Fusion-io's play in that field is. The short version of their strategy is that Fusion-io already has a technology called Virtual Storage Layer (VSL), which is essentially a driver/software stack similar to NVMe. The truth is that NVMe isn't really anything new from a technology perspective; what makes it alluring for many manufacturers is that the NVMe drivers are universal and already supported by the latest operating systems. Technologies like VSL are rather expensive to develop and require expertise because there is no framework available (i.e. everything has to be developed from scratch), but on the other hand an in-house driver like VSL allows for more customization and optimization.

However, that doesn't mean that SanDisk has no interest in NVMe whatsoever. The company sees that as entry and mid-level enterprise SSDs move from SATA and SAS to PCIe, NVMe will be one of the key factors because of easy and quick deployment. For that market segment the NVMe spec and its limitations are fine – it's only the high-end segment where the benefits of VSL are more prominent. It's actually likely that many manufacturers will turn to custom NVMe drivers anyway for higher and more optimized performance, and in fact that is already happening, with Intel providing its own NVMe driver for the P3600/P3700.

Lastly, let's quickly discuss the ULLtraDIMM. I wrote a quick piece on ULLtraDIMM right after Flash Memory Summit, but SanDisk has already scored Huawei as the third ULLtraDIMM partner (in addition to IBM and Supermicro). The first generation product that is currently available is internally based on a pair of SATA 6Gbps controllers, but SanDisk said that a native DDR to NAND controller is possible in the future if the market adopts the new form factor well. As usual, the industry is fairly slow in adopting new form factors, so it's hard to say whether NAND DIMMs will really take off, but it's a very interesting and potentially useful technology.

Final Words

All in all, SanDisk is definitely one of the most interesting NAND companies going forward. USB drives, eMMC solutions, SSDs and even the storage arrays from the Fusion-io acquisition are all built on NAND, which puts SanDisk in a unique position as the only NAND manufacturer that focuses solely on NAND products. The company can't turn to alternative revenue sources the way Intel and Samsung can, but on the other hand that's also SanDisk's strength, as all the know-how and experience in the company is related to NAND in one way or another.

Ultimately, next year will be crucial for SanDisk because it will determine whether the company can realize the full potential of the Fusion-io acquisition and become a serious competitor to Intel and Samsung in the enterprise space. The pieces are definitely there, so it's just a matter of execution now.

Comments

  • mkozakewich - Saturday, December 6, 2014 - link

    Few people trust rating systems, though. What they rely on are hard capacity numbers, like sizes. GB, MP, GHz, HD or 4K...
    We need to push the speeds instead of the sizes. Sell computers with 500 MB/s transfer speeds, not computers with 120 GB storage.
  • Stochastic - Saturday, December 6, 2014 - link

    This seems like a solid idea.
  • stephenbrooks - Saturday, December 6, 2014 - link

    YES. The disk speed statistics are really important to show because (A) they are often the performance bottleneck and (B) there are very large variations even between SSDs.
  • desolation0 - Sunday, December 7, 2014 - link

    I wish we could find a single speed number, have it make sense, have all the manufacturers use it and not cheat, and have it be useful for day to day comparison between two drives. We may need some collaboration through an independent industry speed rating body. As is, a product may list a different read speed based on the compressibility of the files or any of a number of other things that can heavily sway how useful the number the manufacturers are using actually is. It's a pain to see a cheap drive advertising the same maximum MB/s as the premium drive, when the difference between the two in real use can be obvious to even a moderate user. Then there's the traditional issue of changing components without changing the model number, so you may get a relatively large gap in performance even from what should be the same device.

    Personally, I'd like to see how useful an SSD would be in a university computer lab setting, with multiple users on the same machines with no need for large storage since the files get wiped from each system periodically. Make user verification pretty much the only bottleneck to swapping from user to user.

    Really though, I think it's the business route to home user success. Where time = money, you have to make it pay to switch. Make the technology cheap enough and useful enough that businesses adopt it even for their low value employee computers. As in, even if the computer is getting used exclusively by a low wage employee, it will pay for itself to have an SSD instead of a spinning disk. At $10 per hour wage, a $100 SSD pays for itself if it saves 10 hours of work for the user. Of course, the plan also has to pay for any technical overhead from the IT department to implement and maintain the new drives. If you also save them time, the drive pays itself off sooner. Add to their workload, and it takes longer to reach the point of paying itself off. Once in the hands of business users for their everyday workload, then it's a matter of having the users notice the difference from their home computer and associate the difference with having an SSD. If you can pull that off, then you'll get more adopters at home. They will want to save themselves time/money to do more important and fun things, like watch cat videos on Youtube.

    Then, you still need the computer manufacturers in on it. Make it an easy way to segment their market. Multiple storage drive base systems may be a market area the manufacturers should encourage. Maybe design a chassis to allow a novice user to easily swap in a better drive. Just get the main niches going, however you can.
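    The payback arithmetic in the comment above ($10/hour wage, $100 SSD, pays for itself after 10 saved hours) generalizes easily. A minimal sketch, assuming IT overhead is valued at the same hourly rate as the user's time (the function name and parameters are mine, not from the comment):

    ```python
    def payback_hours(drive_cost, hourly_wage, overhead_hours=0.0):
        """Hours of user time an SSD must save before it pays for itself.

        overhead_hours: IT time spent deploying/maintaining the drive,
        assumed to be valued at the same hourly rate as the user's wage,
        so it simply pushes the break-even point out.
        """
        return drive_cost / hourly_wage + overhead_hours

    # The comment's example: a $100 SSD for a $10/hour employee
    print(payback_hours(100, 10))     # 10.0 hours of saved work to break even
    # Add 2 hours of IT deployment overhead and break-even moves out
    print(payback_hours(100, 10, 2))  # 12.0 hours
    ```

    As the commenter notes, anything the drive saves the IT department shortens this, and anything it adds to their workload lengthens it.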
  • Minion4Hire - Friday, December 5, 2014 - link

    Showing a side-by-side boot comparison is actually a really easy way to convince the average person of an SSDs benefit. It's visual, it's tangible, and something everyone can relate to. Such a comparison is how I convinced my group to only purchase new systems that include SSDs.

    With Windows 8 I'm not sure how good such a comparison is anymore, but 2 years ago it was impressive enough.
  • MikeMurphy - Friday, December 5, 2014 - link

    It matters in an enterprise environment, which is described in this article as a large part of Fusion-IO's business.
  • Primum - Sunday, December 7, 2014 - link

    Once you use a computer with a good SSD, you'll never go back. It's not just the start up time, it's installation time, time spent writing to disk, time applications take to start up, and any other activity that requires accessing the disk. Nothing makes a PC better than an SSD. Nothing. After that it's probably at least 8GB of RAM, then you can worry about CPU/GPU.

    If you don't use an SSD every day, then try a slow HDD: you don't know what you're missing.
  • Wwhat - Sunday, December 7, 2014 - link

    Personally I'd be hesitant to recommend an SSD to a non-technical person due to the unknowns regarding the lifetime of a cheap SSD. I would feel bad if someone bought an SSD, the thing gave warnings of imminent failure a year and a half later, and they didn't know how to recover from it, replace it, and move their OS to a new drive.

    As for demos in malls.. that seems extremely silly.
  • Hrel - Monday, December 8, 2014 - link

    This is horrid advice, how can you possibly recommend cloud storage in light of nude leaks and NSA spying and the government currently trying to push through the TPP in secret?

    You must either work for a cloud storage company or the NSA. Worst advice EVAR!

    M-SATA plus 2.5" hdd is the solution, period. Cloud Storage is not an option, at all, for anything of a personal nature. Business, sure, but certainly not anything sensitive or secret.
  • spidey81 - Friday, December 5, 2014 - link

    "64/128GB size could potentially be cheaper than 500/1TB harddrive." This is what I've been wondering for some time. But my theory is it's all in the marketing. Will an unknowing consumer buy a laptop with a terabyte of storage before they buy one with 128GB? Of course they'll opt for the terabyte even though they'll never use it. I can't tell you how many PC's I've looked at for friends and family that don't scrape the surface of their storage capabilities. So to that end, it would be a combination of a good marketing team and well engineered internal specs. It will have to happen sometime I suppose.
