VLC, GPL and the Apple App Store

Update: I wrote this post using the WordPress iPhone app, so I've just got home and corrected some formatting.

Today I read (see here) that the successful VLC iPhone app might be pulled from the App Store.

The reasoning behind this, apparently, is that the App Store Terms of Service breach the GPL, in that all apps are sold with DRM.

First of all, this is lunacy. After 3 years trying to get it into the store, pulling it would cause an uproar. After Apple's successful weathering of the no-Flash controversy, that uproar is not going to get Apple to remove DRM.

Second, VLC is open source. So, open source the app, or release a DRM-free version on the jailbreak app stores. Problem solved.

So my advice to the VLC team is to grin and bear it. Nobody said the world was perfect.

Sticking to the letter of the GPL may be wonderful for the open source diehards, but the rest of us seriously couldn’t care less.

Windows Azure Feedreader Episode 6: Eat Your Vegetables

A strange title, no doubt, but I’ll explain in a moment.

Firstly, apologies for the delay. I’ve been busy with some other projects that couldn’t be delayed. And yes, I have been using Windows Azure Storage for that.

I’m doing some interesting work with Tropo, the cloud communications platform. It’s similar to Twillo. So, at some point I’ll do a screencast – my contribution to the lack of documentation for Tropo’s C# library.

This week's episode had the stated intention of testing the OPML upload and storage routine we wrote back in Episode 4.

We manage this: reading in the contents of the OPML file, storing the blog metadata in Windows Azure Tables and the posts in Blob Storage.

However, in getting there, we have to tackle a number of bugs. Truthfully speaking, a few could have been avoided earlier – such as the fact that calling UrlPathEncode does not escape the '+', and so IIS7 freaks out as a result (e.g. when blob names are used in URLs). Others I had no idea about – like the requirement that blob container and queue names be all lowercase.
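
To see the UrlPathEncode gotcha in isolation, here's a quick standalone sketch (the blob name is made up):

    using System;
    using System.Web;

    class EncodingCheck
    {
        static void Main()
        {
            string blobName = "my feed+name";

            // UrlPathEncode escapes the space but leaves the '+' untouched,
            // which IIS7's request filtering then rejects in a URL path.
            Console.WriteLine(HttpUtility.UrlPathEncode(blobName)); // my%20feed+name

            // UrlEncode does escape the '+' (and turns the space into one).
            Console.WriteLine(HttpUtility.UrlEncode(blobName));     // my+feed%2bname
        }
    }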

Which brings me to why I’ve named this episode as such. Dave Winer wrote a brilliant post earlier this week about working with Open Formats. To quote him:

1. If it hurts when you do it, stop doing it.

2. Shut up and eat your vegetables.

3. Assume people have common sense.

Number 2 says you have to choose between being a person who hangs out on mail lists talking foul about people and their work, or plowing ahead and making software anyway even though you’re dealing with imperfection. If you’re serious about software, at some point you just do what you have to do. You accept that the format won’t do everything anyone could ever want it to do. Realize that’s how they were able to ship it, by deciding they had done enough and not pushing it any further.

So, this episode is about doing exactly that – shutting up about the bugs and getting the software to work.

The fact is that every episode so far has been about writing the foundations upon which we can build. The bugs solved in this episode mean we'll have fewer problems down the road. We have been immersed in the nitty-gritty details now, so that later we can build without having to worry whether our feed details are really being read and stored, whether our tables are being created correctly, or whether blobs are being stored in the right containers.

Enjoy (or not) my bug fixing:

Remember to head over to Vimeo.com to see the episode in all its HD glory.

A word about user authentication. While I don't cover it in the video, I'll be moving to Google Federated Login over Windows Live ID. So for next week, I'll have the Windows Live stuff removed, and we'll be using Google along with the Forms Authentication integration that the DotNetOpenAuth library provides.

Next week, the order of business is as follows:

  1. Clean up our view HTML
  2. ViewData doesn’t work in a couple of places – so we need to fix that
  3. Store user subscriptions.
  4. Make a start on our update code

PS. I added a new pre-roll this week as an experiment. Hope you like it.

Creating Video Pre-Roll with Expression Design, Blend and Encoder

The only part of Expression Studio 3 I’ve really had a chance to work with is Expression Encoder 3 for the screencast work. So I thought I’d have some fun with the other parts of Expression Studio, namely Design and Blend.

Now, it took a bit of time to figure out, but it's much easier designing your assets in Design than it is in Blend; Blend is for the animation. It's actually pretty easy to use, and well thought out. So, once I figured that out, things were much easier.

Now, I really didn’t do anything too adventurous. Just making basic shapes on a canvas and animating them. No events, triggers or anything like that.

So I decided to do a pre-roll for the screencasts. Here's what I came up with:

If you want to do your own, it's really easy.

  1. Using Expression Design, design the assets on the canvas. Make sure the canvas dimensions are the same as the resolution of the video (800 x 600, 1280 x 720, etc.). This is important.
  2. Go to File->Export and choose "XAML WPF Canvas" as the format under Export Properties.
  3. In Blend, create a new WPF Application. Right-click on the solution file and select "Link to Existing File". Navigate to and select the file you exported earlier.
  4. Animate it by creating a new Storyboard – this will turn on recording. (The sketch after this list shows roughly what Blend records for you.)
  5. Save your file.
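
For the curious, the Storyboard that Blend records amounts to something like this in code – a minimal sketch, assuming a Rectangle named rect on the exported canvas (the element and the animation values are illustrative):

    using System;
    using System.Windows;
    using System.Windows.Media.Animation;
    using System.Windows.Shapes;

    public static class PreRollAnimation
    {
        public static void Start(Rectangle rect)
        {
            // Slide the shape 200 units to the right over 5 seconds by
            // animating the attached Canvas.Left property.
            var move = new DoubleAnimation(0, 200, new Duration(TimeSpan.FromSeconds(5)));
            Storyboard.SetTarget(move, rect);
            Storyboard.SetTargetProperty(move, new PropertyPath("(Canvas.Left)"));

            var storyboard = new Storyboard();
            storyboard.Children.Add(move);
            storyboard.Begin();
        }
    }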

We now have our XAML canvas for our pre-roll. Now, using Expression Encoder, we can add this as a visual overlay to any video of our choice. However, we need a video to overlay the XAML onto – as far as I know, you can't add it any other way.

So go to Windows Live Movie Maker. Add credits to your new project. Delete the text. Choose a clip length – I chose 5 seconds, but this should be the same length as your animation. Save your video. Make sure it's the same resolution as your XAML canvas.

  1. In Expression Encoder, import the video from Windows Live Movie Maker.
  2. Go to Timeline->Add Visual Overlay. Navigate to the XAML file.
  3. Resize the overlay till it covers the entire screen.
  4. If you want to, add sound by going to Timeline->Add Audio Overlay.
  5. Hit encode.

It should be noted that you can only import a XAML canvas as a visual overlay in Expression Encoder. That's why we chose "XAML WPF Canvas" when we exported from Design.

If I have any upload quota left after this week's Windows Azure Feedreader screencast, I'll add a short screencast demonstrating this.

Have fun.

Core Competencies and Cloud Computing

Wikipedia defines Core Competency as:

Core competencies are particular strengths relative to other organizations in the industry which provide the fundamental basis for the provision of added value. Core competencies are the collective learning in organizations, and involve how to coordinate diverse production skills and integrate multiple streams of technologies. It is communication, an involvement and a deep commitment to working across organizational boundaries.

So, what does this have to do with Cloud Computing?

I got thinking about the different providers of cloud computing environments. If you abstract away the specific feature set of each provider, what differences remain that set these providers apart from each other?

Now, I actually started thinking about this backwards. I asked myself why Microsoft Windows Azure couldn't do a Google App Engine and offer free applications. I had to stop myself there and go off to Wikipedia to remind myself of the quotas that go along with an App Engine free application:

Hard limits

  Apps per developer                         10
  Time per request                           30 sec
  Blobstore size (total file size per app)   2 GB
  HTTP response size                         10 MB
  Datastore item size                        1 MB
  Application code size                      150 MB

Free quotas

  Emails per day                             2,000
  Bandwidth in per day                       1,000 MB
  Bandwidth out per day                      1,000 MB
  CPU time per day                           6.5 hours
  HTTP requests per day                      1,300,000*
  Datastore API calls per day                10,000,000*
  Data stored                                1 GB
  URLFetch API calls per day                 657,084*

Now, the reason I even asked this question was that I got whacked with quite a bit of a bill for the original Windows Azure feed reader I wrote earlier this year. That was for my honours year university project, so I couldn't really complain. But looking at those quotas from Google, I could have done that project many times over for free.

This got me thinking. Why does Google offer that and not Microsoft? Both of these companies are industry giants, and both have boatloads of CPU cycles.

Now, Google, besides doing its best not to be evil, benefits when you use the web more. And how do they encourage that? They go off and create Google App Engine. Then they allow the average dev to write the app they want to write and run it. For free. Seriously, how many websites run on App Engine's free offering?

Second, Google is a Python shop. Every time someone writes a new library or comes up with a novel approach to something, Google benefits. As Python use increases, some of that code is going to be contributed right back into the Python open source project. Google benefits again. Python development is a Google core competency.

Finally, Google is much maligned for its approach to software development: throw stuff against the wall and see what sticks. By giving the widest possible number of devs space to go crazy, Google ensures that more apps are going to take off.

So, those are all Google's core competencies:

  1. Encouraging web use
  2. Python
  3. See what sticks

And those are perfectly reflected in App Engine.

Let's contrast this with Microsoft.

Microsoft caters to those writing line-of-business applications. They don't mess around. Their core competency, in other words, is other companies' IT departments. Even when one looks outside the developer side of things, one sees that Microsoft Office and Windows are offered primarily to the enterprise customer. The consumer versions of said products aren't worth the bits and bytes they take up on disk. Hence, Windows Azure is aimed squarely at companies who can pay for it, rather than enthusiasts.

Secondly, Windows Azure uses the .NET Framework, another uniquely Microsoft core competency. With it, it leverages the C# language. Now, it is true that .NET is not limited to Windows, nor is Windows Azure a C#-only affair. However, anything that runs on Windows Azure leverages the CLR and the DLR – two pieces of technology that make .NET tick.

Finally, and somewhat related, Microsoft has a huge install base of dedicated Visual Studio users. Microsoft has leveraged this by creating a comprehensive suite of Windows Azure Tools.

Hopefully you can see where I'm going with this. Giving stuff away for free for enthusiasts to use is not a Microsoft core competency. Even with Visual Studio Express, there are limits – limits clearly defined by what enterprises would need, and you pay through the nose for those.

So Microsoft's core competencies are:

  1. Line-of-business devs
  2. .NET, C# and the CLR/DLR
  3. Visual Studio

Now, back to what started this thought exercise – Google App Engine's free offering. As you can see, it's a uniquely Google core competency, not a Microsoft one.

Now, what core competencies does Amazon display in Amazon Web Services?

Quite simply, Amazon doesn't care who you are or what you want to do: they will provide you with a solid service at a very affordable price and sell you all the extra services you can handle. Amazon does the same with everything else, so why not cloud computing? Actually, AWS is brilliantly cheap. Really. This is Amazon's one great core competency, and they excel at it.

So, back to what started this thought exercise – a free option. Because of its core competencies, Google is uniquely positioned to offer one. And thinking about it this way, the reason Microsoft and Amazon lack a similar offering becomes obvious.

Also, remember what I said about the cost of Windows Azure: Google App Engine and its free option mean that university lecturers are choosing to teach their classes using Python and App Engine rather than C# and Windows Azure.

Remember what a core competency is. Wikipedia defines Core Competency as:

Core competencies are particular strengths relative to other organizations in the industry which provide the fundamental basis for the provision of added value. Core competencies are the collective learning in organizations, and involve how to coordinate diverse production skills and integrate multiple streams of technologies. It is communication, an involvement and a deep commitment to working across organizational boundaries.

I guess the question is, which offering makes the most of its parent company's core competencies? And is this a good thing?

Windows Azure Feedreader Episode 5: User Authentication

Firstly, apologies for being late with this episode. Real life presented some challenges over the weekend.

This week's episode focuses on the choice of user authentication systems. As I mentioned last week, there is a choice to be made between Google Federated Login and Windows Live ID.

So, this week, I implement both systems.

It should be noted that I only do the basics for Google Federated Login – that is, only the OpenID part of the process. We'll leave OAuth till later.

If you read my earlier post, I was still deliberating on which to use. Having actually worked with the DotNetOpenAuth library in an application-centric manner, I find it the more appealing of the two. Because it integrates nicely with Forms Authentication, it lends itself to MVC. Also, because of this, having dual login systems isn't going to be possible. So we have to choose one of them.

So next week, we'll be removing the code for the loser. As I said above, I'm leaning toward Google Federated Login.

So this week we cover both systems in code. Here's the show:

And remember, you’ll have to go to vimeo.com to see the full HD version.

I had planned to test the OPML code we wrote last week instead, but we'll do that in next week's episode. As a bonus, we can then do it properly, with full integration with user information.

About next week's episode: I'm busy all weekend, so it may or may not appear on time next Tuesday.

Apple TV Announcement: More Questions than Answers

So, here's what I said a few weeks ago:

The rumours are:

  • Supposedly will be priced at $99, which isn't too bad of a price
  • It will basically be a little iPhone 4 without the phone or screen…
  • The insides supposedly will be iPhone 4-like: A4 CPU, 16GB of flash storage
  • Supposedly can only handle up to 720p video
  • Apple will be officially changing the name of the device to iTV…

I just left this reply to this post on the GeekTonic blog discussing the Apple TV rumours that will not die.

Well… 1080i or 1080p is only really viable if you have cable internet… and that's a really small market. So 720p is pretty much a good bet as the default res.

I don’t care what anybody says – I ain’t streaming movies. Not with a 15gb fair use download cap. I’m getting a local copy of everything. Download once re-use all over the house. However, a 16Gb capacity is barely enough for the photos I have on the Apple Tv. Currently I play all my TV shows off the server rather than sync them. So while not being able to cache much content locally on the device, as long as i can download a copy to the server, I’m happy.

An App Store would be great – though I would assume it wouldn't be backwards compatible with older Apple TVs, since it would require apps to be recompiled (or even re-written) for the older hardware/CPUs.

The $99 price point is also awesome. It does open up the market. It's under the psychological $100 barrier – so people will be more likely to buy it.

The form factor is a persistent rumour – however, I don't see the logic of it. Given the clutter in the TV closet, there's a real chance of me losing it. However, it will be a boon to those who already have too many set-top boxes in their TV closets – look out for Steve to mention this prominently in any event. I still think it's possible that Apple may keep the current form factor in some way.

However, unless the new Apple TV (or, as you say, iTV) launches with some really awesome apps that will be worth the outlay of $99, I can't see myself rushing out to buy one. Since we got satellite TV installed with the accompanying HD-PVR a few weeks back, use of the Apple TV has declined a lot – like 2 or 3 times a week as opposed to every day. Purchases since can be counted on one hand.

So, a bit of a mixed reaction to this.

I was watching the event purely for the Apple TV announcement.

[slide: apple-fall-2010_0272]

First of all, what Steve never mentioned:

  • Details as to the innards of the Apple TV
  • Whether it'll handle 1080p, or just 720p as it does currently
  • Whether content purchases will still be available in iTunes on the desktop
  • Whether this software will be backwards compatible with the current hardware
  • Why there is a USB port on the back of the Apple TV
  • Ping for the Apple TV

The Apple Store page answers none of these (burning) questions. Until I have the answers to these, the new Apple TV is going to linger in geek limbo – neither here nor there.

What he did mention was (and we are not surprised at):

[slide: apple-fall-2010_0307]

While this is a bit of a letdown, there is one thing that I'm happy about: no storage management. Currently, it's a bit of a pain.

Like I said, as long as I can have a local copy somewhere, I'm happy with the move to streaming. Now, I can get some killer speeds out of my ADSL connection: if I use Steam or Microsoft File Transfer Manager I can get 600Kbps, but iTunes and ordinary downloads are still slow. So I'm hoping that the Apple TV will be able to take advantage of these speeds to bring seamless streaming.

Update: Fraser Speirs makes the point that this may well be iOS on the new Apple TV, but nobody's telling. Again, backwards compatibility with the current Apple TV will be telling.

I'm not surprised at no apps or iOS. However, if the Apple TV runs an A4 CPU, you can bet your bottom dollar iOS will come soon. And if that's the case, I don't think that Apple will offer upgrades to current Apple TVs.

[slide: apple-fall-2010_0348]

The new price is nice. Suddenly, upgrading doesn't look like such a big deal.

[slide: apple-fall-2010_0312]

Need I say more?

[slide: apple-fall-2010_0326]

Since I’m in the UK, Netflix isn’t such a big deal. But there are UK providers of on demand streaming movies such as Sky TV. And it will be interesting to see if Apple does deals with these companies.

[slide: apple-fall-2010_0319]

[slide: apple-fall-2010_0322]

The UI is almost the same. As you can see, the button to add to the wishlist is much more prominent now, which is actually a good thing.

[slide: apple-fall-2010_0337]

AirPlay from any iOS 4.1 device to the Apple TV sounds awesome. But exactly what content can be streamed? iTunes Store content only?

It's still called the Apple TV, which is probably a good thing. It is possible, however, that the name change will come when the change to iOS occurs.

So, the jury is still very much out on the new Apple TV.

[slide: apple-fall-2010_0388]

Thanks to GDGT for the pictures.

Windows Azure Feedreader: Choosing a Login System – Which would you choose?

Update: Here’s the related screencast episode.

As you may have noticed in the last episode (Episode 4), writing the feed reader has got to the stage where we require user IDs.

Given the delicate nature of login credentials and the security precautions required, it's much easier to hand off the details to Google Federated Login, or even Windows Live ID. These services simply give us a return token indicating who has logged in.

The previous version of the feed reader used Windows Live ID. It's a very simple implementation: it consists of a single MVC controller and a small iFrame containing the login button. It's elegantly simple. Since it's MVC, there are no issues running it on Windows Azure. The reasons I picked it last time were a) its simplicity and b) it's part of the Windows Azure ecosystem.
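
For reference, that controller boils down to something like this – a rough sketch, assuming the WindowsLiveLogin helper class from the Web Authentication SDK samples (the controller and action names here are illustrative, not the project's actual code):

    using System.Web.Mvc;
    using WindowsLive; // namespace of the SDK's sample WindowsLiveLogin helper

    public class LiveIdController : Controller
    {
        // Reads the application ID and secret key from web.config.
        private static readonly WindowsLiveLogin wll = new WindowsLiveLogin(true);

        // Windows Live POSTs a token back to this action after sign-in.
        [AcceptVerbs(HttpVerbs.Post)]
        public ActionResult Handler()
        {
            WindowsLiveLogin.User user = wll.ProcessLogin(Request.Form);
            if (user != null)
            {
                // The token yields an opaque, app-specific user ID.
                Session["LiveIdUserId"] = user.Id;
            }
            return RedirectToAction("Index", "Home");
        }
    }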

The alternative is to use Google Federated Login. This is a combination of OpenID and OAuth. The implementation is certainly much more involved, with a lot of back and forth with Google's servers.

[diagram: the OpenID federated login flow, from Google's documentation]

  1. The web application asks the end user to log in by offering a set of log-in options, including using their Google account.
  2. The user selects the “Sign in with Google” option. See Designing a Login User Interface for more options.
  3. The web application sends a “discovery” request to Google to get information on the Google login authentication endpoint.
  4. Google returns an XRDS document, which contains the endpoint address.
  5. The web application sends a login authentication request to the Google endpoint address.
  6. This action redirects the user to a Google Federated Login page, either in the same browser window or in a popup window, and the user is asked to sign in.
  7. Once logged in, Google displays a confirmation page (redirect version / popup version) and notifies the user that a third-party application is requesting authentication. The page asks the user to confirm or reject linking their Google account login with the web application login. If the web application is using OpenID+OAuth, the user is then asked to approve access to a specified set of Google services. Both the login and user information sharing must be approved by the user for authentication to continue. The user does not have the option of approving one but not the other. Note: If the user is already logged into their Google account, or has previously approved automatic login for this web application, the login step or the approval step (or both) may be skipped.
  8. If the user approves the authentication, Google returns the user to the URL specified in the openid.return_to parameter of the original request. A Google-supplied identifier, which has no relationship to the user’s actual Google account name or password, is appended as the query parameter openid.claimed_id. If the request also included attribute exchange, additional user information may be appended. For OpenID+OAuth, an authorized OAuth request token is also returned.
  9. The web application uses the Google-supplied identifier to recognize the user and allow access to application features and data. For OpenID+OAuth, the web application uses the request token to continue the OAuth sequence and gain access to the user's Google services. Note: OpenID authentication for Google Apps (hosted) accounts requires an additional discovery step. See OpenID API for Google Apps accounts.

As you can see, it's an involved process.

There is a C# library available called DotNetOpenAuth, and I'll be investigating its integration into MVC and its use in the feed reader.
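
To give a flavour of what that looks like, here's a minimal sketch of an MVC login action, adapted from the DotNetOpenAuth relying-party samples (the controller name and redirect targets are illustrative):

    using System.Web.Mvc;
    using System.Web.Security;
    using DotNetOpenAuth.Messaging;
    using DotNetOpenAuth.OpenId;
    using DotNetOpenAuth.OpenId.RelyingParty;

    public class AccountController : Controller
    {
        private static readonly OpenIdRelyingParty openid = new OpenIdRelyingParty();

        public ActionResult Authenticate()
        {
            IAuthenticationResponse response = openid.GetResponse();
            if (response == null)
            {
                // Stage 1: no assertion yet, so redirect the user to
                // Google's OpenID endpoint.
                IAuthenticationRequest request =
                    openid.CreateRequest(Identifier.Parse("https://www.google.com/accounts/o8/id"));
                return request.RedirectingResponse.AsActionResult();
            }

            // Stage 2: Google has redirected back with an assertion.
            if (response.Status == AuthenticationStatus.Authenticated)
            {
                // The claimed identifier doubles as the Forms Authentication username.
                FormsAuthentication.SetAuthCookie(response.ClaimedIdentifier, false);
                return RedirectToAction("Index", "Home");
            }
            return View("Login");
        }
    }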

There is one advantage of using Google Accounts, and that's the fact that the Google Base Data API lets us import Google Reader subscriptions.

It may well be possible to allow the use of dual login systems. Certainly, sites like stackoverflow.com use this to great effect.

Why is choosing an external login system important?

Well, firstly, it's one less username and password combination that has to be remembered.

Secondly, security considerations are the onus of the authentication provider.

If we were to go with multiple authentication providers, I'd add a third reason: not having an account with a single chosen authentication provider is a source of frustration for users.

So, the question is, dear readers, which option would you choose?

  1. Google Federated Login
  2. Windows Live ID
  3. Both

Windows Azure Feed Reader Episode 4: The OPML Edition

As you’ve no doubt surmised from the title, this weeks episode deals almost entirely with the OPML reader and fitting it in with the rest of our code base.

If you remember, last week I showed a brand new version of the OPML reader code using LINQ and extension methods. This week, we begin by testing said code. Given that it's never been tested before, bugs are virtually guaranteed. Hence we debug the code, make the appropriate changes and fold the changes back into our code base.
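
For anyone who missed that episode, the reader boils down to something like this – a simplified sketch, not the project's actual code (the FeedInfo type and attribute handling here are illustrative):

    using System.Collections.Generic;
    using System.Linq;
    using System.Xml.Linq;

    public class FeedInfo
    {
        public string Title { get; set; }
        public string XmlUrl { get; set; }
    }

    public static class OpmlExtensions
    {
        // Flattens an OPML document into the feeds it references.
        public static IEnumerable<FeedInfo> ToFeeds(this XDocument opml)
        {
            return from outline in opml.Descendants("outline")
                   where outline.Attribute("xmlUrl") != null
                   select new FeedInfo
                   {
                       Title = (string)outline.Attribute("title")
                               ?? (string)outline.Attribute("text"),
                       XmlUrl = (string)outline.Attribute("xmlUrl")
                   };
        }
    }

Usage is then a one-liner: var feeds = XDocument.Load("subscriptions.opml").ToFeeds();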

We then go on to creating an upload page for the OPML file. We store the OPML file in a blob and drop a message in a special queue indicating that this OPML file needs processing. We make the changes to WorkerRole.cs to pull that message off the queue, process the file correctly, and retrieve and store the feeds. If you've been following along, none of this code will be earth-shattering to you either.
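
The blob-and-queue handoff itself is only a few lines. Here's a rough sketch of the web role side using the StorageClient library (the container and queue names are illustrative; the real ones are in the codeplex project):

    using System.IO;
    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.StorageClient;

    public static class OpmlUpload
    {
        public static void StoreAndEnqueue(CloudStorageAccount account, string blobName, Stream opml)
        {
            // Remember: container and queue names must be all lowercase.
            CloudBlobContainer container =
                account.CreateCloudBlobClient().GetContainerReference("opmluploads");
            container.CreateIfNotExist();
            container.GetBlobReference(blobName).UploadFromStream(opml);

            // Drop a message pointing at the blob; WorkerRole.cs pulls it off
            // the queue, parses the OPML and stores the feeds.
            CloudQueue queue =
                account.CreateCloudQueueClient().GetQueueReference("opmlpending");
            queue.CreateIfNotExist();
            queue.AddMessage(new CloudQueueMessage(blobName));
        }
    }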

The fact is that a) making the show any longer would bust my Vimeo Basic upload limit and b) I couldn't think of anything else to do that could be completed in ~10 minutes.

The good thing is that we're back to our 45-minute-ish show time, after last week's aberration.

Don’t forget, you can head over to Vimeo to see the show in all its HD glory: http://www.vimeo.com/14510034

After last week's harsh lessons in web-casting, file backup and the difference between 480p and 720p when displaying code, this week's show should go perfectly well.

Enjoy.

Remember, the code lives at http://windowsazurefeeds.codeplex.com

PS. There's some occasional interference in the sound. I'm wondering if the subwoofer is causing it while I'm recording. Apologies.