Using Windows Home Server, part 2

I did say last time that I’d download and install a Virtual Server image of Exchange 2007. I did, and it was way too slow for comfort. So I went ahead and downloaded the normal evaluation install package (Microsoft gives you a month to try out the package). I’m installing it now. By the way, Virtual Server works flawlessly on WHS, even with an image mounted and running – it’s just that my server didn’t have the gigahertz to do the job.

WHS took an age and a half to get round to running the install, and is presently unzipping the files.

On to other stuff. I think that WHS really needs something akin to a scheduler for backups – even as an option. I say this because the times at which it takes a backup seem totally random. Sometimes it backs up one PC at the very beginning of the backup window, but then waits and backs up the next PC three hours later. It can be totally erratic. It might be because my backup window is 12 hours long. I’m going to try cutting that to 3 hours over lunch (I prefer to turn all my PCs off at night, including the server) and see if it makes any difference.
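
Even a dumb, fixed timetable would be an improvement. Here’s a minimal Python sketch of the kind of thing I mean – purely illustrative, since WHS exposes no such hooks today, and trigger_backup() is entirely hypothetical:

```python
import datetime
import time

# Hypothetical per-PC timetable. WHS doesn't expose anything like this
# today, so trigger_backup() is a stand-in for whatever an SDK might offer.
SCHEDULE = {
    "desktop": datetime.time(12, 0),  # back up over lunch
    "laptop": datetime.time(13, 0),
}

def trigger_backup(pc_name):
    """Placeholder: ask the server to back up the named client PC."""
    print(f"Backing up {pc_name}...")

def run_scheduler():
    backed_up_today = set()
    while True:
        now = datetime.datetime.now()
        if now.time() < datetime.time(0, 5):
            backed_up_today.clear()  # new day: reset the done-list
        for pc, start in SCHEDULE.items():
            if now.time() >= start and pc not in backed_up_today:
                trigger_backup(pc)
                backed_up_today.add(pc)
        time.sleep(60)

run_scheduler()  # runs forever; Ctrl+C to stop
```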

I think that the major gripe, if you could call it that, is that WHS spends most of its time doing nothing – and I’m talking about the time it doesn’t spend serving files and music. A stripped-down version of Exchange, to keep users’ emails in one central store, is one idea. Someone on the forums suggested a central update server for Windows Update.

But all of the above are Microsoft add-ons. The big thing is that WHS will have an SDK. Imagine that. Need something? Build it. The question is how wide-ranging the SDK will be in the first place. The limits placed on using the server programmatically will dictate in a big way how we can leverage the SDK in development (i.e. will we be able to access the backup engine? The folder duplication engine? Will we be able to remote into client PCs and perform tasks?). I can think of a few things I’d like the server to do in its downtime – like programmatically checking that the server has actually backed up a client PC, or generating an XML representation of the music library and making sure all libraries are on the same page. Lots you can do.
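
The music library idea, at least, needs nothing more than a directory walk over the shared folder. A rough Python sketch, assuming the music sits on a \\HOMESERVER\Music share (the share name is just my guess):

```python
import os
import xml.etree.ElementTree as ET

def build_library_xml(music_root):
    """Walk a music share and build a simple XML snapshot of its contents."""
    library = ET.Element("library", root=music_root)
    for dirpath, _dirnames, filenames in os.walk(music_root):
        for name in filenames:
            if not name.lower().endswith((".mp3", ".wma", ".flac")):
                continue
            path = os.path.join(dirpath, name)
            track = ET.SubElement(library, "track")
            track.set("relpath", os.path.relpath(path, music_root))
            track.set("size", str(os.path.getsize(path)))
    return ET.ElementTree(library)

# Snapshot the server's library; diff the result against each client's
# snapshot to check that every machine is "on the same page".
tree = build_library_xml(r"\\HOMESERVER\Music")
tree.write("music-library.xml", encoding="utf-8", xml_declaration=True)
```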

As far as my Exchange 2007 Standard evaluation install went, it didn’t go anywhere. It resulted in a ton of errors when it carried out a bunch of tests, and I’d rather deal with them when I’m not so busy. In any case, at $699 a license, I’d rather spend the money on hard drives.

I was just poking around Outlook and came across Data File Management under the File menu. It allows you to change the location of your .pst files. I’m wondering if I could copy them to WHS – that is, into a folder the user has permissions to – and access them from there. In theory, whatever changes you make would be instantly replicated to every Outlook install configured to use that .pst file. This raises all sorts of deconfliction and versioning issues – essentially hell. Alternatively, there’s an Outlook add-in I downloaded some time ago (don’t ask from where, please, but I think it was Microsoft) that backs up the files to a location you choose. That’s another option.
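
If I do go the copy-to-server route, the dumbest possible version is a script that mirrors the .pst files to a server share whenever Outlook is closed (the files are locked while it’s open). A hedged Python sketch – the paths and share name are my assumptions, not anything WHS prescribes:

```python
import shutil
from pathlib import Path

# Assumed locations: Outlook 2007's default .pst folder on Vista, and a
# WHS share the user can write to. Both are guesses, for illustration only.
PST_DIR = Path.home() / "AppData" / "Local" / "Microsoft" / "Outlook"
DEST = Path(r"\\HOMESERVER\Users\me\OutlookBackup")

def backup_psts():
    """Copy any .pst that changed since the last backup. Outlook must be
    closed, or the copy will fail because the file is locked."""
    DEST.mkdir(parents=True, exist_ok=True)
    for pst in PST_DIR.glob("*.pst"):
        target = DEST / pst.name
        if not target.exists() or pst.stat().st_mtime > target.stat().st_mtime:
            shutil.copy2(pst, target)
            print(f"Backed up {pst.name}")

backup_psts()
```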

My adventures with WHS continue…

Google/Microsoft Wars

If you haven’t spotted it yet, there’s a new feature on the Google Personalized Home Page, with a twist: the themes change depending on the time of day. It’s a great addition. The personalized home page is yet another Google initiative to become the start page of the internet. Themes just make it better.

I don’t think the Microsoft/Google wars are limited to the internet.

The wars aren’t really about the internet. They’re more about who influences the user more. If Microsoft got its collective act together and poured the same zest and energy it’s used up fighting Google into its proven, core product lines of Windows and Windows Server, there is no doubt that we’d have a kick-ass OS that has people queueing up to buy it. Microsoft desperately needs to concentrate on the bread and butter of its business. I’m not suggesting that Microsoft will suddenly disappear in a Kansas tornado. Neither will Google. But surely it’s better to get the job done right the first time.

On the other hand, I’m not suggesting that Microsoft give up – far from it. I’m suggesting that greater priority be given to the Windows OS. Microsoft has many winning products out there – look at the 360 for one.

Look at Google as an example. Google have always leveraged their core product: search. All their features have search at their core. Why? Google are world class when it comes to search – why waste that? Google are not diversifying into OSes or games; they are completely focused on their core product. What do we, the consumers, get? A kick-ass search engine.

While Microsoft’s web effort is valiant, laudable and brave, catching up with Google seems a bit too far-fetched even for viewers of the Sci-Fi Channel. In short: it’s a fine line between fighting a battle and fighting a losing battle.

Building the Back-End

If you haven’t read the Ask the Wizard blog, don’t worry – I subscribed only yesterday. Although this is mainly a business post, focusing on how building the back end of a service first gives you more flexibility in the products that go live, I see a great many parallels with software development.

Much more simple to understand the FeedBurner example. We didn’t spend the first five months building those two services we rolled out in February. We spent the first five months building out the architecture for feed filtering and feed processing such that we could quickly deploy any new feed service we decided to build, and then we spent about a week building out those first two services. Yes, it was true that somebody could have built a competitor to what we launched in a weekend. However, we would be able to quickly iterate and innovate on top of our release, whereas the built-in-a-weekend competitor would have to keep building one-off services that would eventually either become untenable or require an incredibly long period of underlying architecture refactoring while we continued to innovate.

In software development, you can get right in there and do your bit – but then that’s all your software does. If you spend the time creating a framework that supports the code you later write, it’s so much easier to add features and services, because the framework has already been built and all the really complex stuff goes on under the bonnet. I suppose it comes down to the adage: build it right the first time.
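
To make that concrete, here’s a toy Python sketch of the difference (nothing to do with FeedBurner’s actual code): once the shared pipeline exists, a new “service” is a couple of small filter functions rather than another weekend of plumbing.

```python
# The framework, built once: a pipeline that runs feed items through filters.
def run_service(items, filters):
    """Each 'service' is just a list of small filter functions.
    A filter returns a (possibly modified) item, or None to drop it."""
    results = []
    for item in items:
        for f in filters:
            item = f(item)
            if item is None:
                break
        else:
            results.append(item)
    return results

# Two "new services", each a few lines instead of a one-off application.
def strip_sponsored(item):
    return None if item.get("sponsored") else item

def add_footer(item):
    return {**item, "body": item["body"] + "\n-- via my feed service"}

feed = [{"body": "hello"}, {"body": "buy now!", "sponsored": True}]
print(run_service(feed, [strip_sponsored, add_footer]))
# -> [{'body': 'hello\n-- via my feed service'}]
```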

Hillary Clinton, Apple and 1984

How are they all linked (via The Browser)?

It’s a classic. Couldn’t resist posting it.

They comment:

On the Web, the rules are much murkier. The standards set last year for Internet campaigning and advertising are largely untested. And they’re also largely irrelevant: the Hillary 1984 video has already been viewed some 400,000 times on YouTube, without anyone being able to say definitively who posted it. Even if we one day learn the person’s identity, whatever damage (or help!) the video might be able to achieve will have been done. With that kind of viral power, you can be assured we’ll be seeing a lot more “anonymous” videos like this one.

This kind of power really warps the political dynamic in a way that hasn’t been tested before. The ability of videos like this to go viral while keeping their makers anonymous gives the political operative another weapon to add to his arsenal. With technology being used extensively in the run-up to the 2008 elections, it’s going to be very interesting to gauge its impact on the voters.

Super Large Screens

I’ve seen not one, but two posts today about multi-monitor/super large screens.

The first is from Scott Hanselman on multi-monitor setups:

While I was at the Eleutian offices last week I was impressed at their commitment to the multi-monitor lifestyle. I’m all about the Third Monitor (in case you haven’t heard, it’s one better than just two monitors) as are others. If you value your time, you should think about getting the widest view possible.

The Dell 30-inch is amazing…they each had a Dell 30″ widescreen at 2560×1600 pixels, but they also had what appeared to be two 22″ widescreens, rotated and butted up against the 30″, so their horizontal working space was 1050+2560+1050=4660 pixels wide. Glorious. I turned them on to (I hope) RealtimeSoft’s must-have Ultramon multimonitor tool. They were running x64, and Ultramon has a 64-bit version, so that was cool.

And Simon Brocklehurst points to this cool video of the most advanced multi-touch, super large monitor setup I have ever seen:

The point is – we’ve been used to the desktop metaphor for user interfaces for a long time now, but still the “desktops” on our computers are incredibly small compared to our real, physical desktops. If someone gave you a desk in your work place that was 24 inches across, you wouldn’t be able to get much work done on it. And yet, a 24 inch LCD screen is seen as an extravagant luxury by many. Lots of companies give their employees 15 inch “computer desktops” to work on.

You can see his point. I have a 19″ myself that suits me most of the time. But sometimes it’s just so small.

Considering Simon’s point, I’m wondering why there is still a stigma attached to multi-monitor setups in the workplace. The cost can be outweighed many times over by the productivity benefits, which is actually an incentive for businesses. Space is a concern, but the setup Scott saw doesn’t take up that much space. So I’m wondering why. Perhaps it’s slightly too hard for the bosses to believe that having an extra monitor to check your emails, or to keep a reference to what you’re working on in front of you, is beneficial. It seems like a large outlay for little perceived return. Ha! Question answered. It’s a crisis of perception.

As my PC is at home, I’m wondering whether the outlay for a new monitor is justified (given that productivity is not an issue)?

Using Windows Home Server, part 1

Well, now I’ve got my trusty Windows Home Server working, I can tell you how it runs. And it runs pretty well, let me tell you.

This is the second full day that it’s been installed, and it has already taken two backups of each computer (a 30 GB and a 40 GB machine). The backups take up only 30 GB at the moment, and I don’t expect that to rise very fast either, as duplicate files (e.g. Windows install files) are only copied once.
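
I don’t know how WHS does this internally, but the general trick is content-addressed storage: key each file by a hash of its contents, and a second copy of the same file costs nothing. A toy Python sketch of the idea:

```python
import hashlib
from pathlib import Path

def file_hash(path, chunk_size=1 << 20):
    """Hash a file in chunks so big files never have to fit in memory."""
    h = hashlib.sha1()
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            h.update(chunk)
    return h.hexdigest()

def backup(files, store):
    """store maps content-hash -> file bytes; identical files (say, the
    same Windows install files on two PCs) are stored exactly once."""
    for path in map(Path, files):
        digest = file_hash(path)
        if digest not in store:
            store[digest] = path.read_bytes()
        print(f"{path.name} -> {digest[:12]}")

# Example usage – assumes these two identical files exist:
# store = {}
# backup(["pc1/setup.exe", "pc2/setup.exe"], store)  # second copy is free
```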

The server has all my downloads, photos and music on it as well, leaving only essential stuff on the PCs. The printer is installed as well and works fine.

Well, nearly fine. The all-in-one bit is not working: I can’t scan from any other PC. This would be relatively simple to remedy by adding USB port sharing to the Home Connector software that needs to be installed on client PCs (this, in fact, has all sorts of ramifications for USB devices – how about shared ReadyBoost drives in Vista?). I’ve submitted a suggestion on this and voted on every similar suggestion relating to getting all-in-one printers to work.

The Home Connector software is really, really well put together. Without having to drag your screen, keyboard and mouse over to the server, you can control nearly everything about it – with emphasis on nearly. Control freaks will find it leaves something to be desired. Until yesterday the Connector and the Console were out of sync, with the Connector telling me my network health was affected and the Console telling me it was fine. But that seems to have gone now.

Even though the server is next door in the office and shares a wall with my room, I can still hear the fan whirring in the dead of night. Blasted thing – serves me right for buying a Dell server 🙂 . So, to allow me to sleep and safeguard my sanity, I’ve set it to take backups during the day, when I know that the server and the other PCs will be on. By default it’s set to take backups between 12 am and 6 am; changing this upsets the server a bit, but it still works fine. Just make sure that the PCs in question are fully switched on before the backup window opens. I switched my laptop on a bit too late this morning, and a backup started the moment the Connector started talking to the server, slowing the startup process greatly.

I was browsing the forums last night and discovered that people are building multi-terabyte Home Servers. The current record as of last check was 2.06 TB. Whoever that was must really have serious storage requirements a la Google. Or the National Security Agency. At 220 GB, my home server is running smoothly, and I still have 160 GB left. Unless I get really pushed for space, I’m not buying any more hard drives. I’ve turned off the folder duplication feature for the moment while I figure out what to do about it.

Now, WHS is based primarily on Windows Server 2003, meaning that any application designed to run on Windows Server has the potential to run on WHS. So I’m taking my chances and have installed Windows Virtual Server, and I’m in the middle of mounting Exchange 2007 (Beta 2, I think) as a virtual hard drive. It will be very interesting to see if it works. The idea of WHS is to keep stuff you use on multiple PCs in one place (which is why you can access your personal folder from any other PC where you are logged in with the same username and password), so why not extend that to email? More than once I’ve got into the bind of needing my desktop PC’s emails while I’m working on the laptop, or vice versa. Since it’s running in a virtual machine, I doubt it’ll harm WHS’s processes. If this works, it follows to ask what else we can get away with running on Virtual Server. Buy one server, get one or two free – why not?

Already, after a few short days of use, I can see the appeal of WHS. We live increasingly busy lives and have less time to deal with things like backups. WHS automates the whole process. With Norton Ghost I had to keep fiddling around to get it to do what I wanted; WHS just works. Oh, and Ghost insisted on moving 10 GB files around my network for each backup – WHS doesn’t.

I can already see myself buying a release copy of WHS.

Net Neutrality

Just read this post by Doc Searls about this.

Basically, the net should be a base service along with telephony and cable (or, given the rise of VoIP and YouTube, a service on which telephony and TV/cable run).

But that is not always the case, simply because the wide variety of companies that own the different line networks each work to keep the others off their network. Confused? I sympathise. That may be the case in the US; here in the UK, the issue is slightly simpler, though no less touchy.

Until about 15 years ago, the whole telephone network was nationalised under British Telecommunications (most other services, such as gas and electricity, were also nationalised). So today all the physical network cables are owned by BT, but there is a huge number of companies offering phone and broadband services. They do this by “buying” x thousand lines from BT every month, which they in turn sell to their customers. However, to get things done (i.e. installing an extra phone line or a broadband line), those companies still have to work through BT. So even though the lines are guaranteed to these carriers by law, BT discourages people from switching by dragging its feet when it comes to sending engineers out.

I know this because we just moved house two months ago. BT told us it would take them 3 days max to have our broadband line connected if we went with them. We went with another carrier and it took 6 weeks. In fact, the reason I had no broadband for 6 months was that BT dragged its feet replacing the broadband line.

So, although the little guy might not see this as much of a big deal, businesses cannot rely on such service and are forced to go with BT in order to guarantee a phone and broadband line.

Doc Searls:

First, the Net is a vast set of connections on which countless services can be deployed. Telephony and television are just two. Because telephone and cable companies offer Internet connections as a secondary “service” on top of their primary businesses, we tend to think of the Net in the same terms. This is a mistake. The Internet will in the long run become a base-level utility, and we will come to regard telephony and television as two among many categories of data supported by that utility.

Second, the end-to-end nature of the Net puts everybody on it in a position to both produce and consume. It is not just about consumption. It is at least as much about production. In the U.S., telephone and cable companies have deployed Net services in asymmetrical and crippled forms from the beginning. While this crippling is easily rationalized (typical usage is asymmetrical, and turning off outbound mail and web service ports discourages spamming), it also serves to discourage countless small and home businesses. Worse, “business-grade service” (symmetrical with no port blockages) is so expensive in most cases that it is essentially prohibited.

Third, most customers in the U.S. face a choice of one or two Internet carriers: their local phone and cable companies. Other providers can only sell services that run on those carriers. (Since the Brand X decision in 2005, phone and cable companies can keep any of these other providers off their lines if they want to.) In many areas (such as mine), only one company provides “high speed” Net access. There is no choice, and there is no competition.

Windows Home Server Install, Part 5

Well, I got the server this afternoon and set it up. All that remains is to install the printer, which I will leave for tomorrow. Once I disconnected the second hard drive I’d added, the install went fine. I’ve no idea why. But then again, that’s why I’m beta testing.

The server install stopped temporarily when I forgot to return the install DVD to the drive after using my driver CD for the Ethernet and bus devices. I simply restarted the server with the DVD in the drive and the install carried on fine. If you need to know, it stopped at the “installing WinFX” bit.

First impressions: very impressed.

Adding usernames and passwords could not be easier. I have the same username/password combination on both of my PCs, and both were connected to the server without a hitch.

Adding a hard drive was very easy as well. Straightforward. My mother could do it, it was THAT easy. As for physically adding the hard drive: I’m using Serial ATA drives next time; they are much easier to install in a Dell PowerEdge server.

Adding shared folders was easy as well. You just have to be careful of the read/write permissions: by default these are set to read only, and you need to set them to read/write before you can move files in there. The permissions work seamlessly with Windows Explorer (pardon me for being slightly ignorant, but I’m not sure whether this is a product of the Home Connector software or it just works that way).

The Home Connector software is very easy to install, and as an added bonus it finds the server automatically. The UI is slick, clean and uncluttered. There is no space wasted on extra config options (which is why I’m going to have to drag the screen, keyboard and mouse over to the server tomorrow to install my printer). It’s well thought out and very Vista-ish.

At the moment I’m copying a 3.36 GB folder full of downloads to the server (that’s after deleting the .iso files for my SUSE, Solaris and Vista Beta 2 discs). This is very sad, as it’s taking an age – and then there’s my roughly 10 GB music folder to come. I don’t want these folders backed up, so I’m copying them off first. The other thing I’m doing, given how full my PC’s disk is, is uninstalling programs that I don’t need or want.

Right now I want to review the core selling points of the server: the backup (hopefully I will not have to test the restore bit 🙂 ) and the media sharing aspects of it. I have plans to install Virtual Server and run an Exchange 2007 VM on it, but that is a week if not more away.

I’ll have to start a “Windows Home Server” series now that the install is done.