Thursday, December 2, 2010

Storage Area Network

Today I learned what a Storage Area Network (SAN) is.

My employer is merging with another firm that provides the same services we do (Fee-Only Wealth Management). I can say this now publicly, since the press release has gone out. As part of this merger deal we need to find a way to share network resources with this other firm located in New Jersey. My job has been to work with a colleague in the New Jersey office to find an outsourced IT partner that would provide us with a way to do this.

We had a potential outsourced IT partner in the office today to present their proposal. They had e-mailed us the proposal previously, and a component of it was, apparently, a Storage Area Network. What is a Storage Area Network?

I decided I should figure this out. I did some online research. As far as I could tell, a SAN was a box that housed a bunch of RAIDed drives and could hold a big honking amount of data. Great. But how is that different from any other network attached device (a NAS, a Snap server, or a plain ol’ server with lots of disk space)?

The outsourced IT partner had brought in an army of consultants from Dell to explain to us why we wanted a SAN. They had PowerPoint slides. They had jargon, all of which they assured us we would be unable to understand (I love it when my vendors talk down to me!) but which they felt compelled to fling around anyway. In the middle of their talk one of my colleagues mentioned that he used to work for EMC and, y’know, write software to deal with This Sort of Thing all the time. We think that might have shortened their talk a little and perhaps convinced them that we weren’t idiots.

They did at least succeed in explaining to me that a SAN was a very cool thing indeed, but they didn’t answer my fundamental question of what it was. And ultimately, as cool as the concept was, no one succeeded in convincing me or my colleagues why having one would help us with our main goal: being able to share network resources with the office in New Jersey.

I didn’t actually understand what a SAN was until later in the day, when I wandered over to my colleague’s cubicle to ask him what he thought of the presentation. During our discussion he explained that SANs used to be wicked expensive, and that until (apparently) recently no outfit smaller than American Airlines would have been interested in one. “They don’t have Ethernet connectors. They have these funny looking things with light and fibers…” he said. “A fiber optic connection?” I asked. “Exactly.” So that’s what one of these is: a big honking network attached device that connects through fiber optics. Fiber is faster than Cat 6 (Ethernet) cable. That’s not a difficult concept to absorb. But somehow I hadn’t gotten it from Wikipedia’s article. The article did mention fiber networks, but it led me to believe these were old school (from the ’90s), so I was unsure how relevant that information was in 2010.

The Dell guys, who talked at us for an hour, didn’t mention this. The outsourced IT provider didn’t mention this. But in a five-minute conversation with my co-worker I figured out what the big deal was*.

What have I learned from this? Well, it reinforces some of the happy teamwork memes I’ve been so busy absorbing at MBA school. Reading stuff on the Internet is not always enough. Sometimes you need an actual human; better yet, a human who knows you and knows how to explain things in a way you’ll understand.

*It also has other neat features, which I gathered from the presentation, but those are mostly interesting from a disaster recovery point of view; they don’t necessarily make a SAN any more attractive than NASes, and they don’t explain why these things are so farking expensive.

2 comments:

Anonymous said...

Wow, Storage Area Networks. Cool.

Unknown said...

See, I'm not sure I agree with that characterization of the difference. I'm not sure whether this is a difference of personal style and approach, or a fundamental difference in what we understand.

I view the primary difference between SANs and NASes as a shift in who's in control of what. With a NAS, the NAS device controls how file names are mapped to physical disk: the interface the NAS presents to the servers that talk to it is one where you deal in file names and file contents that can grow and change size. In other words, the interface the NAS presents to the servers is very much like the interface the OS presents to users.
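
To make that concrete, here's a rough sketch in Python of what that file-level view looks like from a client program. The mount point and file name are made up for illustration:

    # File-level access, NAS-style: you name the file, and the device
    # decides how the bytes land on physical disk. The path below is a
    # hypothetical mounted share (NFS, SMB, whatever).
    with open("/mnt/nas_share/clients/report.txt", "a") as f:
        f.write("files have names, and they can grow\n")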

With a SAN, the interface the device presents to the server is at the level of block numbers. Blocks don't have names, and they have a fixed byte size. In other words, this is the low-level interface a physical device inside the server would present to the server's OS, or the level you'd have to go mucking at if you were examining a sector-by-sector dump of a disk for data recovery efforts or forensics.
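
And a rough sketch of the block-level view, with a made-up device path and block number. Note that it's the same interface a local disk would present, which is exactly the point:

    # Block-level access, SAN-style: no file names, just numbered blocks
    # of a fixed size. /dev/sdb and the block number are hypothetical; on
    # a real SAN this "disk" would arrive over a fast fabric like Fibre
    # Channel rather than a local cable.
    BLOCK_SIZE = 512
    BLOCK_NUMBER = 2048

    with open("/dev/sdb", "rb") as disk:
        disk.seek(BLOCK_NUMBER * BLOCK_SIZE)  # blocks are addressed by number
        block = disk.read(BLOCK_SIZE)         # and are always a fixed size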

So with a NAS, it's like somebody shoved the network in between the user-mode programs that think of data in terms of "files" and the OS's filesystem. With a SAN, it's like somebody shoved a network into the middle of the SCSI cable that runs between the server's controller card and the disk drive.

Now, because of this, a SAN only really works if the network cable is really, really fast, since the SAN is replacing what should by all rights be a local cable, and it's talking to OS kernel code, which isn't very tolerant of long delays. (User-mode programs, on the other hand, can usually deal with the occasional random delay.)

So I think a SAN is only practical if it's on fiber, but there's nothing preventing you from putting a NAS on fiber.