Marquee de Sells: Chris's insight outlet

You've reached the internet home of Chris Sells, who has a long history as a contributing member of the Windows developer community. He enjoys long walks on the beach and various computer technologies.




Moving My Site to Azure: ASP.NET MVC 2

In our last episode, I talked about the joy and wonder that is moving my site’s ISP-hosted SQL Server instance to SQL Azure. Once I had the data moved over and the site flipped to using the new database, I needed to move the site itself over, which brought joy and wonder all its own.

Moving to Visual Studio 2013

I haven’t had to do any major updates to my site since I rebuilt it in 2010 using Visual Studio 2010. At that time, the state of the art was ASP.NET MVC 2 and Entity Framework 4, which is what I used. The combination was a pleasant experience, letting me rebuild my site from scratch quickly and producing a site that ran like the wind. In fact, it still runs like the wind. Unfortunately, Visual Studio 2012 stopped supporting MVC 2 (and, no surprise, Visual Studio 2013 didn’t add MVC 2 support back). When I tried to load my web site project into Visual Studio 2013, it complained:

[Image: This version of Visual Studio is unable to open the following projects]

This error message lets me know that there’s a problem and the migration report provides a handy link to upgrade from MVC 2 to MVC 3. The steps aren’t too bad and there’s even a tool to help, but had I followed them, loading the new MVC 3 version of my project into Visual Studio 2013 would’ve given me another error with another migration report and a link to another web page, this time helping me move from MVC 3 to MVC 4 because VS2013 doesn’t support MVC 3, either. And so now I’m thinking, halfway up to my elbows in the move to MVC 3 that Visual Studio 2013 doesn’t like, that maybe there’s another way.

It’s not that there aren’t benefits to moving to MVC 4, but that’s not even the latest version. In fact, Microsoft is currently working on two versions of ASP.NET: ASP.NET MVC 5 and ASP.NET v.Next. Even if I do move my site forward two versions of MVC, I’ll still be two versions behind. Of course, the new versions have new tools and new features and can walk my dog for me, but by dropping old versions on the floor, I’m left with the choice of running old versions of Visual Studio side-by-side with new ones, upgrading to new versions of MVC just to run the latest version of VS (even if I don’t need any of the new MVC features) or saying “screw it” and just rewriting my web site from scratch. This last option might seem like what Microsoft wants me to do so that they can stop supporting the old versions of MVC, but what’s to stop me from moving to AWS, Linux and Node instead of to ASP.NET v.Next? The real danger of dropping the old versions on the floor isn’t that I’ll move over to another platform, because I’m a Microsoft fanboy and my MSDN Subscription gives me the OS and the tools for free; it’s that large paying customers will say “screw it” and move their web sites to something that their tools are going to support for more than a few years.

Luckily for me, there is another way: I can cheat. It turns out that if I want to load my MVC 2 project inside of Visual Studio 2013, all I have to do is remove a GUID from the csproj file inside the ProjectTypeGuids element. The GUID in question is listed on step 9 of Microsoft’s guide for upgrading from MVC 2 to MVC 3:

[Image: Removing {F85E285D-A4E0-4152-9332-AB1D724D3325} from your MVC 2 project so it will load in Visual Studio 2013]

By removing this GUID, I give up some of the productivity tools inside Visual Studio, like easily adding a new controller. However, I’m familiar enough with MVC 2 that I no longer need those tools and being able to actually load my project into the latest version of Visual Studio is more than worth it. Andrew Steele provides more details about this hack in his most excellent StackOverflow post.
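Concretely, the edit is just the one element in the .csproj; a sketch of the before and after (the MVC 2 GUID is the one called out above, and the other two GUIDs shown are the usual web application and C# project types, so your project’s list may differ):

<!-- Before: the MVC 2 project type GUID keeps Visual Studio 2013 from loading the project -->
<ProjectTypeGuids>{F85E285D-A4E0-4152-9332-AB1D724D3325};{349c5851-65df-11da-9384-00065b846f21};{fae04ec0-301f-11d3-bf4b-00c04f79efbc}</ProjectTypeGuids>

<!-- After: drop the MVC 2 GUID and leave the rest alone -->
<ProjectTypeGuids>{349c5851-65df-11da-9384-00065b846f21};{fae04ec0-301f-11d3-bf4b-00c04f79efbc}</ProjectTypeGuids>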

Now, to get my MVC 2 project to actually build and run, I needed a copy of the MVC 2 assemblies, which I got from NuGet:

[Image: Adding the MVC 2 NuGet package to my project inside Visual Studio 2013]

With these changes, I could build my MVC 2 project inside Visual Studio 2013 and run on my local box against my SQL Azure instance. Now I just need to get it up on Azure.

Moving to Azure

Publishing my MVC 2 site to Azure was a matter of right-clicking on my project and choosing the Publish option:

[Image: Publishing a web site to Azure using the Solution Explorer’s Publish option inside Visual Studio 2013]

Selecting Windows Azure Web Sites as the target and filling in the appropriate credentials was all it took to get my site running on Azure. I did some battle with the “Error to use a section registered as allowDefinition='MachineToApplication' beyond application level” bug in Visual Studio, but the only real issue I had was that Azure seemed to need the “Precompile during publishing” option set or it wasn’t able to run my MVC 2 views when I surfed to them:

[Image: Setting the “Precompile during publishing” option for Azure to run my MVC 2 views]

With that setting in place, my Azure site just ran at the Azure URL I had requested: http://sellsbrothers.azurewebsites.net.
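For the record, the “Precompile during publishing” checkbox maps to a property in the publish profile (.pubxml); this is my best guess at the relevant setting, so verify it against your own profile:

<PropertyGroup>
  <PrecompileBeforePublish>True</PrecompileBeforePublish>
</PropertyGroup>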

Where are we?

I’m a fan of the direction of ASP.NET v.Next. The order of magnitude reduction in working set, the open source development and the use of NuGet to designate pieces of the framework that you want are all great things. My objection is that I don’t want to be forced to move forward to new versions of a framework if I don’t need the features. If I am forced, then that’s just churn in working code that’s bound to introduce bugs.

Tune in next time and we’ll discuss the fun I had configuring the DNS settings to make Azure the destination for sellsbrothers.com and to add SSL to enable secure login for posting articles via AtomPub and Windows Live Writer.





Moving My Site to Azure: The Database

In a world where the cloud is no longer the wave of the future but the reality of the present, it seems pretty clear that it’s time to move sellsbrothers.com from my free ISP hosting (thanks, securewebs.com!) to the cloud, specifically Microsoft’s Azure. Of course, I’ve had an Azure account since its inception, but there has been lots of work to streamline the Azure development process in the last two years, so now seemed like the ideal time to jump in and see how blue the waters really are.

As with any modern web property, I’ve got three tiers: presentation, service and database. Since the presentation tier uses server-side generated UI and its implementation is bundled together with the service tier, there are two big pieces to move – the ASP.NET site implementation and the SQL Server database instance. I decided to move the database first, with the idea that once I got it hosted on Azure, I could simply flip the connection string to point the existing site at the new instance while I did the work to move the site separately.

Deploy Database To Windows Azure SQL Database from SSMS

The database for my site does what you’d expect – it keeps track of the posts I make (like this one), the images that go along with each post, the comments that people make on each post, the writing and talks I give (shown on the writing page), book errata, some details about the navigation of the site, etc. In SQL Server Management Studio (SSMS), it looks pretty much like you’d expect:

[Image: sellsbrothers.com loaded into SQL Server Management Studio]

However, before moving to Azure SQL Server, I needed a SQL Azure instance to move the data to, so I fired up the Azure portal and created one:

[Image: Creating a new SQL Azure database]

In this case, I chose to create a new SQL Azure instance on a new machine, which Azure will spin up for us in a minute or two (and hence the wonder and beauty that is the cloud). I chose the Quick Create option instead of the Import option because the Import option required me to provide a .bacpac file, which was something I wasn’t familiar with. After creating the SQL Server instance and the corresponding server, clicking on the new server name (di5fa5p2lg in this case) gave me the properties of that server, including the Manage URL:

[Image: SQL Azure database properties]

If you click on the Manage URL, you get a web interface for interacting with your SQL Azure server, but more importantly for this exercise, its FQDN is what I needed to plug into SSMS so that I could connect to that server. I’d need that in a minute, because in the meantime, I’d discovered what looked like the killer feature for my needs in the 2014 edition of SSMS:

[Image: Deploy Database to Windows Azure Database in SSMS 2014]

By right-clicking on the database on my ISP in SSMS and choosing Tasks, I had the Deploy Database To Windows Azure SQL Database option. I was so happy to choose this option and see the Deployment Settings screen of the Deploy Database dialog:

[Image: SSMS Deploy Database dialog]

Notice the Server connection is filled in with the name of my new SQL Server instance on Azure. It started blank and I filled it in by pushing the Connect button:

[Image: SSMS Connect to Server dialog]

The Server name field of the Connect to Server dialog is where the FQDN we pulled from the Manage URL field of the Azure database server properties screen earlier goes, and the credentials are the same ones I set up when I created the database. However, filling in this dialog for the first time gave me some trouble:

[Image: SQL Azure: Cannot open server ‘foo’ requested by the login]

SQL Azure is doing the right thing here to keep your databases secure by disabling access to any machine that’s not itself managed by Azure. To enable access from your client, look for the “Set up Windows Azure firewall rules for this IP address” option on the SQL database properties page in your Azure portal. You’ll end up with a server firewall rule that looks like the following (and that you may want to remove when you’re done with it):

[Image: SQL Azure server firewall rules]

Once the firewall has been configured, filling in the connection properties and starting the database deployment from my ISP to Azure was when my hopes and dreams were crushed:

[Image: SSMS Deploy Database: Operation Failed]

Clicking on the Error links all reported the same thing:

[Image: Error validating element dt_checkoutobject: Deprecated feature ‘String literals as column aliases’ is not supported by SQL Azure]

At this point, all I could think was “what the heck is dt_checkoutobject” (it’s something that Microsoft added to my database), what does it mean to use string literals as column aliases (it’s a deprecated feature that SQL Azure doesn’t support) and why would Microsoft deprecate a feature that they used themselves in a stored proc that they snuck into my database?! Unfortunately, we’ll never know the answer to that last question. However, my righteous indignation went away as I dug into my schema and found several more features that SQL Azure doesn’t support that I put into my own schema (primarily the lack of clustered indexes on primary keys, which SQL Azure requires so it can keep replicas of your database in the cloud). Even worse, I found one table that listed errata for my books that didn’t have a primary key at all, and because no one was keeping track of data integrity, all of the data was in that table twice (I can’t blame THAT on Microsoft : ).
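For the errata table, the fix looked something like the following; the table and column names here are illustrative rather than my actual schema, but the shape of the change is the same: dedupe the rows, then give the table the clustered primary key that SQL Azure insists on.

-- Remove the duplicate rows, keeping one copy of each (illustrative schema)
;WITH Numbered AS (
  SELECT *, ROW_NUMBER() OVER (
    PARTITION BY Title, PageNumber, Description ORDER BY (SELECT 0)) AS rn
  FROM dbo.Errata
)
DELETE FROM Numbered WHERE rn > 1;

-- Add a clustered primary key, which SQL Azure requires on every table
ALTER TABLE dbo.Errata ADD ErrataId INT IDENTITY(1,1) NOT NULL;
ALTER TABLE dbo.Errata ADD CONSTRAINT PK_Errata PRIMARY KEY CLUSTERED (ErrataId);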

And just in case you think you can get around these requirements and sneak your database into SQL Azure w/o the updates, manually importing your data using a bacpac file is even harder, since you now have to make the changes to your database before you can create the bacpac file and you have to upload the file to Azure’s blob storage, which requires a whole other tool that Microsoft doesn’t even provide.

Making your Database SQL Azure-compatible using Visual Studio

To make my SQL database compatible with SQL Azure required changing its schema. Since I didn’t want to change the schema of a running database on my ISP, I ended up copying the database from my ISP onto my local machine and making my schema changes there. Getting to the point of SQL Azure compatibility, however, required me to have the details of which SQL constructs SQL Azure does and doesn’t support. Microsoft provides overview guidance on the limitations of SQL Azure, but it’s not like having an automated tool that can check every line of your SQL. Luckily, Microsoft provides such a tool built into Visual Studio.

To bring Microsoft’s SQL compiler to bear on checking for SQL Azure compatibility, use VS to create a SQL Server Database Project and then point it at the database you’d like to import from (in my case, the one copied to my local machine from my ISP). After you’ve imported your database’s schema, doing a build will check your SQL for you. To get VS to check your SQL for Azure compatibility, simply bring up the project settings and choose Windows Azure SQL Database as the Target platform:

[Image: Visual Studio 2014: Setting Database Project Target Platform]

With this setting in place, compiling your project will tell you what’s wrong with your SQL from an Azure point-of-view. Once you’ve fixed your schema (which may require fixing your data, too), then you can generate a change script that updates your database in-place to make it Azure-compatible. For more details, check out Bill Gibson’s excellent article Migrating a Database to SQL Azure using SSDT.

The Connection String

Once the database has been deployed and tested (SSMS or the Manage URL are both good ways to test that your data is hosted the way you think it should be), then it’s merely a matter of changing the connection string to point to the SQL Azure instance. You can compose the connection string yourself or you can choose the “View connection strings for ADO.NET, ODBC, PHP and JDBC” option from your database properties page on Azure:

[Image: SQL Azure: Connection Strings]

You’ll notice that while I blocked out some of the details of the connection string in my paranoia, Azure itself is too paranoid to show the password; don’t forget to insert it yourself, and to put it into a .config file that doesn’t make it into the SCCS.
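For reference, the ADO.NET connection string that Azure hands out has roughly this shape (the database and user names here are placeholders and the password is the part you fill in yourself):

Server=tcp:di5fa5p2lg.database.windows.net,1433;Database=yourdatabase;User ID=youruser@di5fa5p2lg;Password={your_password_here};Trusted_Connection=False;Encrypt=True;Connection Timeout=30;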

Where are we?

In porting sellsbrothers.com from an ISP to Azure, I started with the database. The tools are there (nice tools, in fact), but you’ll need to make sure that your database schema is SQL Azure-compatible, which can take some doing. In the next installment, I’ll talk about how I moved the implementation of the site itself, which was not trivial, as it is implemented in ASP.NET MVC 2, which has been long abandoned by Microsoft.

If you’d like to check out the final implementation in advance of my next post, you can help yourself to the sellsbrothers.com project on GitHub. Enjoy.





Bringing The Popular Tech Meetups to Portland

[Image: Portland Tech Meetup logo]

I’ve been watching the Portland startup scene for years. However, in the last 12 months, it’s really started to take off, so when I had an opportunity to mentor at the recent Portland Startup Weekend, I was all over it. I got to do and see all kinds of wonderful things at PDXSW, but one of the best was meeting Thubten Comerford and Tyler Phillipi. Between the three of us, we’re bringing the very popular Tech Meetup conference format to Portland.

The idea of a Tech Meetup is to focus on pure tech. In fact, at the largest of the Tech Meetups, in New York (33,000 members strong!), they have a rule that it’s actually rude to ask about the business model. The Tech Meetups are tech for tech’s sake. If you’re in a company big or small, or if you’re just playing, cool tech always has a place at the Portland Tech Meetup.

The format is simple and if you’re familiar with the way they do things in Boulder or Seattle, you’re already familiar with it. Starting on January 20th, 2014, every 3rd Monday at 6pm, we’ll open the doors for some networking time, providing free food and drink to grease the skids. At 7pm, we’ll start the tech presentation portion of the evening, which should be at least five tiny talks from tech presenters of all kinds. After the talks, we’ll wrap up around 8pm and then head to the local watering hole for the debrief.

If this sounds interesting to you, sign up right now!

If you’d like to present, drop me a line!

If you’d like to sponsor, let Thubten know.

We’re very excited about bringing this successful event to Portland, so don’t be shy about jumping in; the water is fine…





GUI REPL for Roslyn

If you recall from REPL for the Roslyn CTP 10/2011, I’ve been playing around building a little C# REPL app using Roslyn. That version was built as a Console application, but I’ve refactored and rebuilt it as a WPF application:

[image]

You can download the source code for both the Console and the WPF versions here:

RoslynRepl Sample Download

The benefit of a real GUI app is that output selection makes a lot more sense and that you can imagine visualizing real data in data controls instead of just as strings. However, implementing a REPL shell in a GUI environment requires doing things considerably differently than in a Console app. Besides the stupid things I did, like doing a lot of Console.Write, and things that don’t make sense, like #exit or #prompt, there are a few interesting things that I did with this code, including handling partial submissions, rethinking history and rewiring Console.Write (just ‘cuz it’s stupid when I do it doesn’t mean that it shouldn’t work).

Partial Submissions

In this REPL, I decided that Enter means “execute” or “newline” depending on whether the submission is complete enough, according to Roslyn, to execute or not. If it is, I execute it, produce the output and move focus to either the next or a new submission TextBox. If the submission isn’t yet complete, e.g. "void SayHi() {", then I just put in a newline. Further, I do some work to handle selections properly, i.e. if you press Enter when there’s a selection, the selection is replaced by the newline.
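For the curious, the “complete enough to execute” question looks something like this against today’s Roslyn API (the 2011 CTP spelled these names differently, so treat this as a sketch rather than the code in the download):

using Microsoft.CodeAnalysis;
using Microsoft.CodeAnalysis.CSharp;

static class SubmissionChecker {
  // Parse the text as a script submission and ask Roslyn whether it forms
  // a complete submission or still needs more input (e.g. "void SayHi() {").
  public static bool IsComplete(string code) {
    var tree = SyntaxFactory.ParseSyntaxTree(
      code, new CSharpParseOptions(kind: SourceCodeKind.Script));
    return SyntaxFactory.IsCompleteSubmission(tree);
  }
}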

So far I like this model a lot, since I don’t have to do something like separate “execute” and “newline” into Enter and Alt+Enter or some such.

Rethinking History

In a GUI shell with partial submissions and multi-line editing, the arrows are important editing keys, so they can’t be used for access to previous lines in history. Further, a GUI app makes it very easy to simply scroll to the command that you want via the mouse or Shift+Tab, so there’s not a lot of use for Alt+Arrow keys. Pressing Enter again replaces the old output (or error) with new output (or error):

[images]

Currently when you re-execute a command from history, the command stays where it is in the history sequence, but it could as easily move to the end. I haven’t yet decided which I like better.

Redirecting Console.Write

Since this REPL environment works and acts like a shell, I expect Console.Write (and its cousins like Console.WriteLine) to work. However, to make that work, I need to redirect standard output:

Console.SetOut(new ReplHostTextWriter(host));

The ReplHostTextWriter class simply forwards the text on to the host:

class ReplHostTextWriter : TextWriter {
  readonly IReplHost host;

  public ReplHostTextWriter(IReplHost host) { this.host = host; }

  public override void Write(char value) { host.Write(value.ToString()); }

  public override Encoding Encoding { get { return Encoding.Default; } }
}

The host’s implementation of IReplHost.Write simply forwards the text on to the currently executing submission (the ReplSubmissionControl represents both a submission’s input and output bundled together). You’ll notice that the TextWriter takes each character one at a time. It would be nice to do some buffering for efficiency, but you’d also like the output to appear as it’s produced, so I opted out of buffering.
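The IReplHost interface implied here is tiny; this is a guess at its shape rather than the actual interface from the sample download:

interface IReplHost {
  // Forwards output text to the currently executing submission control.
  void Write(string text);
}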

However, one thing I don’t like is the extra newline at the end of most string output. I want the main window to decide how things are output, setting margins and the newline looks like a wacky margin, so the trailing CR/LF had to go. That’s an interesting algorithm to implement, however, since the characters come in one at a time and not line-by-line. I want separating newlines to appear, just not trailing newlines. I implement this policy with the TrimmedStringBuilder class:

// Output a stream of strings with \r\n pairs potentially spread across strings,
// trimming the trailing \r and \r\n to avoid the output containing the extra spacing.
class TrimmedStringBuilder {
  readonly StringBuilder sb;

  public TrimmedStringBuilder(string s = "") {
    sb = new StringBuilder(s);
  }

  public void Clear() {
    sb.Clear();
  }

  public void Append(string s) {
    sb.Append(s);
  }

  public override string ToString() {
    int len = sb.Length;

    if (len >= 1 && sb[len - 1] == '\r') {
      len -= 1;
    }
    else if (len >= 2 && sb[len - 2] == '\r' && sb[len - 1] == '\n') {
      len -= 2;
    }

    return sb.ToString(0, len);
  }
}

Usage inside the ReplSubmissionControl.Write method is like so:

public partial class ReplSubmissionControl : UserControl {
...
  TrimmedStringBuilder trimmedOutput = new TrimmedStringBuilder();

  public void Write(string s) {
    if (s == null) { trimmedOutput.Clear(); }
    else { trimmedOutput.Append(s); }

    consoleContainer.Content = GetTextControl(trimmedOutput.ToString());
  }
}

Now, as the input comes in one character at a time, the trailing newlines are removed but separating newlines are kept. Also, you may be interested to know that the GetTextControl function builds a new read-only TextBox control on the fly to host the string content. This is so that the text can be selected, which isn’t possible when you set the content directly.
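GetTextControl itself isn’t shown above; a minimal sketch of the idea (the real code is in the sample download and may differ) is just a read-only WPF TextBox, so the output stays selectable:

using System.Windows;
using System.Windows.Controls;
using System.Windows.Media;

static TextBox GetTextControl(string text) {
  return new TextBox {
    Text = text,
    IsReadOnly = true,                   // read-only, but still selectable
    BorderThickness = new Thickness(0),  // blend into the surrounding content
    Background = Brushes.Transparent,
    TextWrapping = TextWrapping.Wrap
  };
}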

Right now, there’s no support for Console.Read, since I don’t really know how I want that to happen yet. Pop-up a dialog box? Something else?

Completions, Syntax Highlighting and Auto-indent

I was a few hundred lines into implementing completions using Roslyn with the help of the Roslyn team when I realized two things:

  1. Implementing completions to mimic the VS editor is hard.
  2. Completions aren’t enough – I really want an entire C# editor with completions, syntax highlighting and auto-indentation.

Maybe a future release of Roslyn will fix one or both of these issues, but for now, both are out of scope for my little REPL project.





Moving to the Cloud Part 2: Mostly Sunny

In part 1 of this now multi-part series (who knew?), I discussed my initial attempts at moving my digital life into the cloud, including files, music, photos, notes, task lists, mail, contacts, calendar and PC games. There were some issues, however, and some things that I forgot, so we have part 2.

Before we get to that, however, it’s interesting (for me, at least) to think about why it’s important to be able to move things into the cloud. Lots of vendors are busy making this possible, but why? There are backup reasons, of course, so that a fire or other natural disaster doesn’t wipe out all of the family pictures. There’s also the ease of sharing, since email makes a very poor file sharing system. And multi-device access is certainly useful, since we’ve moved into a heterogeneous OS world again as smartphones and tablets take their place at the table with PCs.

For me, however, moving my data into the cloud is about freedom.

The cloud enables me to get myself bootstrapped with data associated with my personal or business life, using whatever device or OS I feel like using that day. It provides me freedom of location or vendor.

The cloud is still forming, however, so it hasn’t really been able to make this a seamless experience, which is why I’m onto part 2 of this series.

Mail, Contacts and Calendar

Hotmail is a fine system for online access to mail, contacts and calendar that integrates well with Windows Phone 7. However, the integration with desktop Outlook and my custom domain isn’t good enough yet to rely on. The primary problem was the Hotmail Outlook Connector, which isn’t ready yet for prime time. It worked great with calendar and contacts, but fell down badly when it came to large email folders that I moved from my PST file. It never showed the sync’ing progress as complete, which made me uncomfortable that it never actually completed sync’ing and therefore my data wasn’t safe. Also, when I sent an email from Hotmail, either via the web or via Outlook, it showed the reply address as hotmail_44fe54cff788bdde@live.com. I assume the latter would’ve been fixed with Windows Live custom domains, but the former was the real deal-killer for me.

Also, I heard that Google Apps is the way to go, but that also requires some special software to enable sync’ing with desktop Outlook – I wanted something that was native to both Outlook 2010 and Windows Phone 7. Further, it cost money, so if I was going to pay, I wanted something that Microsoft was going to integrate well with.

So, I bit the bullet and hooked myself up with the latest in hosted Exchange: Microsoft Office 365. That’s what I’m using now and, just like the on-premise Exchange that worked great for me as a Microsoft employee, I’ve been very happy with it. However, because of the way I was using it, it was a pain to configure properly for hosting my csells@sellsbrothers.com email.

The easy way to configure Office 365 is to let it be the DNS name manager, which lets it manage everything for you, including your web site (via SharePoint), your mail, your Lync settings and any future service they care to tack on. However, that doesn’t work for me, since I didn’t want to move my 16-year-old web site into SharePoint (duh). Instead, I wanted to leave my DNS name manager at securewebs.com, which has been a fabulous web hosting ISP for me.

A slightly harder way to configure Office 365 for use with your domain is to use it only for selected services, e.g. set the MX record for mail but don’t mess with the CNAME record for your web site. This would’ve been nice, too, except I don’t want to move all of the email accounts on sellsbrothers.com – only csells. Why? Well, that’s a family matter.

Over the years at family gatherings, to seem geek cool, I’ve offered free email boxes to my relatives. “Oh? You’re moving to another ISP again? Why don’t you move your email to sellsbrothers.com and then you can keep the same email address forever! And the best part is that it’s free!”

Now, of course, I’d recommend folks get an email address on hotmail or gmail, but this all started before the email storage wars back when you needed an actual invitation to set up a gmail.com account. Now I’ve got half a dozen family members with “permanent” and “free” email boxes and I don’t want to a) move them, b) charge them or c) pay for them myself on Office 365.

As cheap as you might think I am, it’s really migration that I worry most about – having successfully gotten them set up on their phones and PCs with their current email host, I don’t want to do that again for Outlook or migrate their email. Maybe it’s easy, maybe it’s hard. We’ll never know ‘cuz I’m not doing it!

So now, I have to make csells@sellsbrothers.com sync with Office 365 and leave everyone else alone. This is the hardest way to use Office 365 and involved the following:

Obviously, this is a crappy configuration experience, but no amount of manual updates to Outlook provided by the Office 365 site seemed to help. It was nice that the WP7 Outlook was much easier, although I’d really have loved to just tell desktop Outlook that I was an Office 365 user and have it figure out all the touchy config settings.

Everything seems solid except one minor annoyance: when I do a Reply All, csells@sellsbrothers.com stays in the list because my mail programs don’t know that my csells@sellsbrothers.com and csells@sellsbrothers.onmicrosoft.com email addresses are logically the same. I assume if I was hosting my MX records at Office 365, this problem, along with the crappy config experience, would go away.

The good news is that I’ve got access to my full range of Mail, Contacts and Calendar from the web, my phone and my desktop, including multi-GB email folders I’ve copied over from my PST file, all for $6/month. Had I to do it over again, I’d have long ago moved my family to hotmail and avoided the config nightmare. I may yet do just that.

Encrypted Files

With my mail et al sorted, my next fix from last time was the lack of confidence in my most sensitive files with Dropbox. Dropbox can be hacked or subpoenaed like anyone else, so I want a client-side encryption solution. Dropbox may someday provide this themselves, but currently they gain a great deal of storage savings by detecting duplicate blocks amongst their users, saving significantly on uploads and storage, which client-side encryption disrupts. In the meantime, I really want an app that encrypts on the client and drops my data into Dropbox, which BoxCryptor does nicely.

In addition to supporting Windows, BoxCryptor also supports MacOS, iOS and Android, although not WP7 yet. Further, it’s free for 2GB and only a $40 one-time fee for unlimited data, so it’s cheap, too.

I also looked at SecretSync, which has a similar cross-platform story and pricing model (although it’s $40/year instead of $40/once), but it requires Java and I don’t put that on my box. For an open source solution, you may be interested in TrueCrypt.

Financial Data

I’m a mint.com user. I like the idea of an all-up look at my finances across 29 online financial accounts. However, as a backup of that data, I wrote a mint.com scraping tool that downloads the CSV transactions export file and digs current snapshot data out of the homepage HTML. The format on the web site is constantly changing of course, so it’s a support problem, but having that data even a little messed up over time is way better than not having it at all, so I’m happy. The data itself goes into CSV files that I can query with LINQPad and that are stored in my Dropbox folder, which keeps them sync’d.
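As an example of what “query with LINQPad” means here, a minimal sketch of the kind of query I run over the exported CSV (the file name and column layout are assumptions for illustration, not mint.com’s actual export format):

// A LINQPad "C# Statements" snippet
var lines = File.ReadAllLines(@"D:\mint\transactions.csv").Skip(1); // skip the header row

var byMonth =
  from line in lines
  let fields = line.Split(',')
  let date = DateTime.Parse(fields[0])
  let amount = decimal.Parse(fields[2])
  group amount by new { date.Year, date.Month } into g
  orderby g.Key.Year, g.Key.Month
  select new { g.Key.Year, g.Key.Month, Total = g.Sum() };

byMonth.Dump(); // LINQPad renders the results as a table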

Books and Bookmarks

I can’t believe I missed this last time, but one of the big things I keep in the cloud is my set of Amazon Kindle books. I think that the proprietary format and DRM of Kindle materials will eventually open up because of competition, but until then, Amazon has been a great steward of my online books and bookmarks, providing me clients for all new platforms as well as their own e-ink-based hardware. I have an extensive book collection (this is just part of it), but I am adding to the physical part of it no more.

Further, in case I have the “what the hell was that book I used to have?” moment after I finally truck all of my books off to Powell’s, the Brothers Sells have scanned all of the ISBN numbers from my 500+ books into LibraryThing. I won’t have the books anymore, but at least I’ll be able to browse, refresh my memory and add the books to Kindle on demand. The reason I picked LibraryThing is that it was easy to get all of the book metadata from just an ISBN (so it’s easy to spot data entry errors early), it’s easy to export to a CSV file and, should I decide, easy to use their API.

App Specifics

In addition to the big categories, several apps keep data important to me:

Things Left On The Ground

As you may have surmised, I don’t put a lot of sentimental value in physical things. They’re not nearly as important to me as people, experiences or data. However, there are some things that I’d want to rescue in case of disaster given the chance:

As hard as I try, I can’t think of anything else. Should I have to jam, my plan is to place these few items into safe keeping and sell, donate and/or toss the rest.

Where Are We?

As I write this, I’m sitting in a Starbucks in Sandy, OR, 20 minutes from a cabin I’m renting for a few days. When I’m done here, I’ll explore the town, see a movie and make myself some dinner. I won’t worry about my phone, my laptop or my home being lost or destroyed, since 98% of the possessions I deem most valuable are being managed by cloud vendors I trust.

The cloud doesn’t just represent a place to backup or store data – it represents a way of life.

My data stores a lifetime of experiences, people and knowledge. By keeping it safe and available no matter where I go, I gain the freedom to wander, to experience new physical places and new hardware and software solutions, all without being unduly burdened.

Creative work requires a comfortable place to labor filled with the tools and the materials the worker needs to be creative. Today my tools are an Apple MacBook, Windows 7, Office 2010, Visual Studio 2010 and a Samsung Focus. Yesterday those tools were different and I’m sure they’ll be different again tomorrow. However, while other people build up their place with comfortable things around them – a bookshelf for reference, a comfy chair, knick-knack reminders of events or trips – my place is a lifetime of data and anywhere that provides access to electrons and bits.

Having my data safe, secure and available makes me feel comfortable, creative and free.





Sells Manor: Running 64-bit Win8 on My MacBook Air

With the exception of //build/, I haven’t really been a public part of the Microsoft developer community for about a year. So, to make up for some lost time, I’m giving a talk about some of the //build/ bits at the Portland Area .NET User Group first thing in the new year. This means that I need a running installation of the Windows 8 Developer Preview on my new laptop, ‘cuz THE MAN took my old laptop back when I handed in my badge (although, to be fair, they paid for it in the first place : ).

My constraints were as follows:

So, with all of that in mind, of course I started with Hanselman’s Guide to Installing and Booting Windows 8 Developer Preview off a VHD post. If you’re willing to build a new VHD, that’s the way to go. However, I was able to use the techniques I learned from that post, especially the comment section and a couple tips from my friend Brian Randall to make my existing Win8 VHD work. Some of this may work for you even if you don’t have a MacBook Air.

Getting Windows 7 Running on my MacBook Air

I started with a virginal MacBook and used the built-in Boot Camp to create a Win7 partition, pointed it at a Win7 Ultimate ISO I keep on a network share for just these kinds of emergencies and got Windows installed and running. It wasn’t seamless, but Bing was helpful here to straighten out the curves.

Replacing the Boot Manager

The way I like to create VHDs is via Windows Server 2008 and Hyper-V. Once I have the VHD, I drop it onto the c:\vhd folder on my computer, do a little bcdedit magic and boom: when I reboot, I’ve got a new entry from which to choose my OS of the moment.

However, Win8 doesn’t boot from the Win7 boot manager, so the first thing I needed to do (as implied by the comments on Scott’s post) was use bcdboot to replace the Win7 boot manager with the Win8 boot manager. To do that, boot into Win7 and fire up the Disk Management tool (Start | Run: diskmgmt.msc). Select your BOOTCAMP drive and choose Action | Attach VHD. Choose the path to your VHD and you’ll get another virtual disk:

[image]

In my case, C was my Win7 Boot Camp HD and F was my Win8 VHD. Now, start an elevated command prompt and use bcdboot to replace the Win7 boot manager with the Win8 boot manager.

DISCLAIMER: I’m stealing the “works on my machine” graphic from Hanselman’s site because this action replaces a shipping, maintained, supported boot manager with one that is still in “developer preview” mode. Make sure you have your computer backed up before you do this. I am a trained professional. Do not attempt this at home. All stunts performed on a closed course. Some assembly required. Void where prohibited. I’m just sayin’.

[image]
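The command itself is a one-liner; a sketch of its shape, assuming the attached VHD shows up as F: like mine did (running the copy of bcdboot that lives in the Win8 image so the Win8 boot files are the ones that get copied):

REM Copy the Win8 boot environment from the attached VHD onto the system partition
F:\Windows\System32\bcdboot F:\Windows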

Now that you’ve got the right boot manager in place, getting Win8 to boot requires bcdedit.

Getting Windows 8 to Boot

Scott’s post on booting to VHD describes adding a new boot option via bcdedit in the “Setting up your Windows Boot Menu to boot to an Existing VHD” section:

[Image: Use bcdedit to point to the Win8 VHD]
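That recipe boils down to a few bcdedit calls; a sketch, where the VHD file name is whatever you dropped into c:\vhd and the GUID is the one the /copy command prints out:

REM Clone the current boot entry and point the copy at the VHD
bcdedit /copy {current} /d "Windows 8 Developer Preview"
bcdedit /set {new-guid-from-copy} device vhd=[C:]\vhd\Win8.vhd
bcdedit /set {new-guid-from-copy} osdevice vhd=[C:]\vhd\Win8.vhd
bcdedit /set {new-guid-from-copy} detecthal on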

Logging into Windows 8 on a MacBook

Now when you boot your MacBook, you’ll choose to boot to your Windows partition as you always have (which should just happen automatically), but then the Win8 boot manager will kick in and you can choose your Windows 7 install or your new Windows 8 install. Booting into Windows 8 shows you the login screen as normal, but now you have another problem.

The MacBook keyboard comes without a Windows Delete button. Oh sure, it’s labeled “delete” in trendy lowercase letters, but it’s really the equivalent of the Windows Backspace button. And that’s a problem, because you need to press Ctrl+Alt+Del to log into Win8.

Of course, Apple thought of that, so they created the Boot Camp drivers for Windows that map fn+delete to Delete, but you can only install them after you’ve logged in.

So how do you log into a MacBook without a Delete button? Easy. You attach an external USB keyboard, press that three-fingered salute and login as normal.

Once you’re in that first time, you can install the Boot Camp drivers and never have to use the external keyboard again.

Installing the Boot Camp Drivers on Win8

When I created the Boot Camp USB to install Win7, it came with a set of drivers in the WindowsSupport folder with a wonderful setup.exe that makes Windows run great on the MacBook. Unfortunately, when you try to run it, you get a message that says you can’t:

[image]

If you search the internet, you can find folks that have gotten past this by tricking setup.exe into thinking it’s running on Win7, but you’ll also find that those tricks don’t seem to work for 64-bit installs on MacBook Air, i.e. the one I was doing. However, this is where Brian had another suggestion: you can edit the Boot Camp MSI itself.

DISCLAIMER: This is something that I made work surprising well on my own personal MacBook Air, but I provide no guarantee that it won’t cause your computer to burst into flames on an international flight causing your body to be lost at sea. These techniques are not supported by Microsoft, Apple or the American Dental Association. You’ve been warned.

You may wonder, “To what MSI is Mr. Sells referring?” And I answer: WindowsSupport\Drivers\Apple\BootCamp64.msi. This is the 64-bit MSI with the check in it for Windows 7. To make it work for Windows 8, you need to edit the MSI and change the version number. And to do that, the easiest tool I know of is the unsupported, discontinued Orca MSI editor from Microsoft, now hosted on technipages.com. Running Orca allows you to edit BootCamp64.msi and change the Windows version part of the LaunchCondition from 601 (Windows 7) to 602 (Windows 8):

[image]

Once you’ve changed this version, WindowsSupport\setup.exe seems to run just fine, installing the keyboard entries that allow you to login and the control panel that allows you to customize everything.

Where Are We?

Starting from a Boot Camp installation of Windows 7 on my MacBook Air, I showed you how I was able to get Windows 8 booting from a VHD. It wasn’t pretty and it required tips from all over the internet. I gather them here today so that future anthropologists will know how hard we worked to enable the coming of our robotic overlords. If you’re able to use these instructions to expedite their arrival, I’m sure they’ll take that into consideration when they’re sorting us into work details.

P.S. This post is dedicated to Jerry Pournelle. I used to pore over his Byte magazine column every month like he was the computer Sherlock Holmes.





Moving My Data To The Cloud: Stormy Weather

For years, I’ve maintained a single “main” computer. It was the computer that was the central authority of all of the personal data I’d accumulated over the years and from which it made me uncomfortable to be separated. Because I needed a single computer for everything, it had to work on my couch, on a plane, on a desk and everywhere else I ever needed to go. Also, it couldn’t have a giant monitor or multiple monitors, because it had to go everywhere. All of this was because I needed all of my data with me all of the time.

My process for moving to a new computer used to include a lot of manual copying of files from the old D hard drive (D is for Data) to my new hard drive, which was also carefully partitioned into C for Windows, Office, Visual Studio, etc. and D for a lifetime of books and articles, coding projects and utilities I’ve collected over the years, e.g. LinqPad, Reflector, WinMerge, etc. This is 30GB of stuff I wanted access to at all times. I was also backing up via Windows Home Server, keeping photos and music on the WHS box (another 30GB), then backing that up to the cloud via KeepVault. And finally, as I upgraded HDs to go bigger or go to solid state, I kept each old HD around as another redundant backup.

All of that gave me some confidence that I was actually keeping my data safe right up until my Windows Home Server crashed the system HD and I found out that the redundancy of WHS doesn’t quite work the way you’d like (this was before I installed KeepVault). This was a first generation HP Home Server box and when it went down, I took it apart so I could attach a monitor, keyboard and mouse to diagnose it, pulled the HDs out so I could read what files I could and ultimately had to drop it off in Redmond with the WHS team so I could get it up and running again.

There are some files I never got back.

KeepVault gave me back some of the confidence I’d had before WHS crashed, but they didn’t provide me a way to see what files they were backing up, so I didn’t have the transparency I wanted to be confident. Further, they don’t have clients on every kind of platform like Dropbox does.

Of course, simply sync’ing files isn’t enough – sync’ing my 10GB Outlook PST file every time I got a new email was not a good way to share 20 years of contacts, email and calendar items.

The trick is to sync each kind of data in the right way, be confident that it’s safe and have access to it across the various platforms I use: Windows, Windows Phone 7, iOS and possibly Android (you know, if I feel like walking on the wild side!). And since I’m currently underemployed (my new gig doesn’t start till the new year), I figured I’d do it once and do it right. I almost got there.

Files

Let’s start easy: files. Dropbox has made this a no-brainer. You install the software on any platform you care to use, drop everything you want into the folder and it just works, keeping files in sync in the cloud and across platforms, giving you adequate (although not great) status as it does so. Most platforms are supported natively, but even on platforms that aren’t, there are often alternative clients, e.g. I’m using Boxfiles for Windows Phone 7. When I gave up my Microsoft laptop, instead of doing the dance of the copy fairy to my new MacBook Air, I installed Dropbox on both computers and dropped everything I wanted backed up and sync’d between computers into the Dropbox folder. 36 hours and 30GB later, all of it was copied into the cloud and onto my new laptop, at which point I reformatted my Microsoft laptop and handed it in to my boss.

Further, as a replacement for WHS and KeepVault, I now keep all of the files that I used to keep just on my WHS server – photos and music primarily – in Dropbox.

[image]

This gives me the confidence that my files are safe and backed up to the cloud, while making it very easy to keep them backed up locally by simply running Dropbox on more than one computer at my house. If at any time I don’t want those files on any one computer, I tell Dropbox to stop sync’ing those folders, delete the local cache and I’m all done.

There are two tricks that I used to really make Dropbox sing for me. The first is to change my life: I no longer partition my HDs into C and D. The reason I’d always done that was so that I could repave my C with a fresh Windows, Office and VS install every six months w/o having to recopy all my data. Windows 7 makes this largely unnecessary anyway (bit rot is way down on Win7), but now it doesn’t matter – I can blow any computer away at will now, knowing that Dropbox has my back. In fact, Dropbox is my new D drive, but it’s better than that because it’s dynamic. The C drive is my one pool of space instead of having to guess ahead of time how to split the space between C and D.

The other thing I did was embrace my previous life: I wanted to keep D:\ at my fingertips as my logical “Data” drive. Luckily, Windows provides the “subst” command to do just that. Further, ntwind software provides the fabulous VSubst utility to do the mapping and keep it between reboots:

[image]
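For reference, VSubst is essentially a persistent wrapper around the built-in subst command; a sketch, with the Dropbox path being whatever yours happens to be:

REM Map D: to the Dropbox folder so it looks like a dedicated data drive
subst D: "C:\Users\csells\Dropbox"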

Now, I’ve got all the convenience of a dedicated “data” drive backed up to the cloud and sync’d between computers. Because I needed 60GB to start, I’m paying $200/year to Dropbox for their 100GB plan. This is more expensive than I’d like, but worth it to me for the data I’m storing.

There is a hitch in this story, however. Right now on Dropbox, data and metadata is available to Dropbox employees and therefore to anyone that hacks Dropbox (like the government). I don’t like that and for my very most sensitive data, I keep it off of Dropbox. When Dropbox employees themselves aren’t able to read Dropbox data or metadata, then I’ll move the sensitive data there, too.

Music

I’m not actually very happy with how I’m storing music. I can play all my music on any PC, but I can only play it one song at a time on my WP7 because there’s no Dropbox music client. I could use the Amazon cloud drive that provides unlimited music storage for $20/year, but there’s no WP7 client for that, either. Or I could spend $100/year on Amazon and get my 100GB of storage, but their client isn’t as widely available as Dropbox. Ironically, Dropbox is using Amazon as their backend, so hopefully increased pressure in this space will drop Dropbox’s prices over time.

Photos

I’m not using Facebook or Flickr for my photos simply because I’m lazy. It’s very easy to copy a bunch of files into Dropbox and have the sync’ing just happen. I don’t want to futz with the Facebook and Flickr web interfaces for 15GB worth of photos. Right now, this is the digital equivalent of a shoebox full of 8x10s, but at least I’ve got it all if the house burns down.

Notes and Tasklist

For general, freeform notes, I moved away from Evernote when they took the search hotkey away on the Windows client (no Ctrl+F? really?) and went to OneNote. The web client sucks, but it’s better than nothing and the Windows and WP7 clients rock. I have a few notes pinned to my WP7 home screen that I use for groceries, tasks, etc., and I have all of my favorite recipes in there, too, along with my relatives’ wi-fi passwords that they don’t remember themselves, a recording of my son snoring, etc. It’s a fabulous way to keep track of random data across platforms.

On the task list side, I only sorta use OneNote for that. I also send myself emails and write little TODO.txt files every time I get a little bee in my bonnet. I’ve never found that the Exchange tasks sync well enough between platforms to invest in them. Maybe someday.

Mail, Contacts and Calendar

And speaking of Exchange, that’s a piece of software that Microsoft spoiled me on thoroughly. This is sync that works very well for contacts, emails and calendar items. IMAP does email folders, but server implementations are spotty. For years, I used Exchange for my personal contacts and calendar, only keeping my personal email separate in a giant PST file, pulling it down via POP3. This can sorta be made to work, but what I really wanted was hosted Exchange.

However, what I found cost between $5 and $11 a month per user. I’d probably have gone with Office 365 for sellsbrothers.com mail, even at $5/month except for two reasons. The first is that Microsoft requires you to move your entire DNS record to them, not just the MX record, which means there is all kinds of hassle getting sellsbrothers.com working again. They do this so that they can get all of the DNS records working easily for Lync, Sharepoint, etc., but I don’t want those things, so it’s just a PITA for me. If they change this, I’d probably move except for the other problem: I’m not the only user on sellsbrothers.com.

For years, to be the big shot at family gatherings, I’ve been offering up permanent, free email addresses on my domain. That’s all well and good, but now, to maintain my geek cred, I need to keep my mom, my step-mom, my brother, my sons, etc., on an email server that works and one that they don’t have to pay for. So, while I was willing to pay $5/month for hosted Exchange for me, I wasn’t willing to pay it for my relatives, too!

One option I tried was asking securewebs.com (my rocking ISP!) to upgrade to SmarterMail 8.x, but that didn’t work. I even footed the one-time fee of $200 for the ActiveSync support for SmarterMail, but I couldn’t make that sync from Outlook on the desktop or the phone either.

Eventually I made an imperfect solution work: Hotmail. The nice thing about Hotmail is that it’s free for 25GB (yay webmail storage wars!) and it syncs contacts, mail and calendar items just like I want. Further, with some effort (vague error messages are not useful!), I was able to get Hotmail to pull in my personal email. And, after installing the Outlook Hotmail Connector (explicitly necessary because my Windows Live ID is not a @live.com or an @hotmail.com email address), I was able to sync almost everything, including the folders I copied from my giant PST file, via hotmail to both my desktop and phone Outlook. However, there are a few downsides:

The good news is that this all works for free and my relatives continue to have working email. The bad news is that it doesn’t work nearly as well as the Exchange server I’m used to. Hopefully I will be able to revisit this in the future and get it working correctly.

PC Games

I purchase all of my games via Steam now and install them as the mood strikes me. I love being able to reinstall Half-Life 2 or Portal on demand, then blow it away again when I need the hard drive space. Steam is the only viable app store for Windows right now, although I am looking forward to having the Microsoft app store in Windows 8.

Backups

I no longer maintain “backups” in the sense that I can slap in a new HD, boot from a USB stick and have my computer restored in 30 minutes or less (that never worked between WHS and Dell laptops anyway). I’ve had HD problems, of course, but they’re so rare that I no longer care about that scenario. Instead, what I do is keep all of the software that I normally install on a file server (the new job of my WHS box). If the file server goes down, then most of the software I install, i.e. Windows 7, Office and Visual Studio, is available for download via an MSDN Subscription. The rest is easily available from the internet (including Telerik tools and controls!) and I just install it as I need it.

Where Are We?

In order to free myself from any specific PC, I needed to pick a new centralized authority for my data: the cloud. The experience I was after for my PCs was the same one I already have on my phone – if I lose it, I can easily buy a new one, install the apps on demand and connect to the data I already had in Exchange, Hotmail, Skydrive, etc. Now that I’ve moved the rest of my world to Dropbox, I can treat my PCs and tablets like phones, i.e. easily replaceable. It’s not a perfect experience yet, but it’s leaps and bounds ahead of where it was even a few years ago.

Hardware and software comes and goes; data is forever.





Goodbye Microsoft, Hello Telerik!

I have gotten to do a ton of really great things at Microsoft:

Those and dozens more have all been extraordinary experiences that have made my time at Microsoft extremely valuable. But, like all good things, that time has come to an end.

[Image: Telerik logo]

And now I’m very much looking forward to my new job at Telerik!

Telerik is an award-winning developer tools, UI controls and content management company. They’re well-known in the community not only for their top-notch tools and controls, but also for their sponsorship of community events and their free and open source projects. Telerik is a company that cares about making developers’ lives better and I’m honored that they chose me as part of their management overhead. : )

My division will be responsible for a number of UI control sets – including WinForms, WPF, Silverlight and ASP.NET – as well as a number of tools – including the Just* line, OpenAccess ORM and Telerik Reporting. I’m already familiar with Telerik’s famous controls and am now ramping up on the tools (I have been coding with JustCode recently and I like it). My team is responsible for making sure that developers can make the most of existing platforms, knowing that when you’re ready for the next platform, we’ll be there ready for you.

These controls are already great (as is the customer support – holy cow!), so it’ll be my job to help figure out how we should think about new platforms (like Windows 8) and about new directions.

And if you’ve read this far, I’m going to ask for your help.

I’m going to be speaking at user groups and conferences, blogging and in general interacting with the community a lot more than I’ve gotten to do over the last 12 months. As I do that, please let me know what you like about Telerik’s products and what you don’t like, what we should do more of and what new things we should be doing. Telerik already has forums, online customer support, blog posts and voting – you should keep using those. In addition:

Feel free to reach out to me directly about Telerik products.

Of course, I can’t guarantee that I’ll take every idea, but I can guarantee that I’ll consider every one of them that I think will improve the developer experience. I got some really good advice when I first arrived at Microsoft: “Make sure that you have an agenda.” The idea is that it’s very easy to get sucked into Microsoft and forget why you’re there or what you care about. My agenda then and now is the same:

Make developers’ lives better.

That’s what I tried to do at Intel, DevelopMentor and Microsoft and that’s what I’m going to try to do at Telerik. Thanks, Telerik for giving me a new home; I can’t wait to be there.





Enabling the Tip Calculator in Your Brain

I can’t imagine anyone reading this blog needs to read this, but I can’t help myself.

When I was just a wee lad, probably the most valuable thing I learned was how to perform mathematical estimation, the importance of which, and several techniques, you can get by reading Jon Bentley’s The Back of the Envelope (that essay, along with several others, is collected in his most excellent books Programming Pearls and More Programming Pearls, both of which are still relevant a decade later). Not only is estimation generally quicker than running a calculator, but even when you do run a calculator, it helps you figure out when you did it wrong, the latter of which has saved my bacon time and again.

For example, as much as I love the Windows Phone 7 marketplace and its quality and quantity of applications, the ones that puzzle me are the “tip calculator” apps (several!). I don’t understand why it’s worth the trouble of pulling out your phone and punching buttons when you can know the tip instantly.

For example, let’s assume the dinner bill is $37.42. If the service was bad, that’s a 10% tip (you have to tip them something ‘cuz the IRS assumes you will and taxes them accordingly – bastards). So, with a 10% tip, take the bill and shift the decimal point one place to the left: $3.74. Now, round up or down depending on how bad the service was, e.g. $3.50 or $4. Quick and easy.

Assuming the service was great, that’s a 20% tip, so double the bill and shift the decimal point one place to the left, making the math easier for yourself along the way, e.g. $37.42 is close to $35, doubling gives $70, so a $7 tip. Boom: 20% tip.

If you want to get fancy and provide a 15% tip for good but not great, then average the two numbers: ($4 + $7)/2 = $5.50. Zim zam zoom.

Honestly, as great as the apps are on your phone, tablet or Bluetooth headset (seriously), think about using the apps in your head first. Not only are they quicker and cheaper, but using them staves off dementia (which is a good thing!).

Oh, and if the tip is added as a mandatory minimum, then the additional tip is easy: $0.00. I don’t deal well with authority.





A Function That Forces

Far Side: “Midvale School for the Gifted”

At Microsoft, there’s this passive-aggressive cultural thing called a “forcing function,” which, to put it crudely, is an engineering way for us to control the behavior of others. The idea is that you set up something to happen, like a meeting or an event, that will “force” a person or group to do something that you want them to do.

For example, if someone won’t answer your email, you can set up a meeting on their calendar. Since Microsoft is a meeting-oriented culture (even though we all hate them), a ‘softie will be very reluctant to decline your meeting request. So, they have a choice – they can attend your meeting so that they can answer your question in person or they can answer your email and get that time back in their lives. This kind of forcing function can take larger forms as well. I can’t say that our execs make decisions like this (since they don’t talk to me : ), but it is the case that signing up a large number of Microsoft employees to host and speak at important industry events does have the effect of making us get together to ensure that our technologies and our descriptions of those technologies hold together (well, hold together better than they would otherwise : ).

Unfortunately, this way of thinking has become so much a part of me that I’ve started to use it on my family (which they very much do not like). Worse, I use it on myself.

For example, I have been holding back on half a dozen or more blog posts until I had the software set up on my newly minted web site to handle blog posts in a modern way, namely via Windows Live Writer. In other words, I was using the pressure inherent in the build-up of blogging topics to motivate me to build the support I wanted into sellsbrothers.com for a secure blogging endpoint for WLW. Before I moved all my content into a database, I could just pull up FrontPage/Expression Web and type into static HTML. Now that everything is data-driven, however, the content for my posts is just rows in a database. As much as I love SQL Server Management Studio, it doesn’t yet have HTML editing support that I consider adequate. Further, getting images into my database was very definitely a programming task not handled by any existing tools I was familiar with.
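
I'll save the full details for that follow-up post, but for the curious: WLW talks to a site via the MetaWeblog API over XML-RPC, so the image problem boils down to handling newMediaObject. Here's a rough sketch of that one method, assuming the open source XML-RPC.NET library; SaveImageToDatabase is a hypothetical stand-in for the real data access code, not what I actually shipped:

using CookComputing.XmlRpc;

// These structs mirror the MetaWeblog API's newMediaObject request and response.
public struct MediaObject
{
    public string name;
    public string type;
    public byte[] bits;
}

public struct MediaObjectUrl
{
    public string url;
}

public class MetaWeblogEndpoint : XmlRpcService
{
    [XmlRpcMethod("metaWeblog.newMediaObject")]
    public MediaObjectUrl NewMediaObject(string blogid, string username, string password, MediaObject media)
    {
        // Check the credentials, store the image bytes as a row in the database
        // and hand WLW back the URL it should embed in the post.
        string url = SaveImageToDatabase(media.name, media.type, media.bits);
        return new MediaObjectUrl { url = url };
    }

    // Hypothetical helper: INSERT the bytes into an images table and return the image's public URL.
    string SaveImageToDatabase(string name, string contentType, byte[] bits)
    {
        throw new System.NotImplementedException();
    }
}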

So, this is the first post using my new WLW support and I’m damn proud of it. It was work that I did with Kent Sharkey, a close friend of mine who most resembles Eeyore in temperament and facial expression, and that just made it all the more fun!

Anyway, I’m happy with the results of my forcing function and I’ll post the code and all the details ASAP, but I just wanted to apologize for my relative silence on this blog and to promise that things should get better RSN. XXOO.

P.S. I’m loving Windows Live Writer 11!





Why can't it all just be messages?

My mobile device is driving me crazy.

I have an iPhone 4. Normally when it's driving me crazy, it's standard stuff, like the battery life sucking, or the fact that the iOS 4.0.1 update didn't fix the proximity detection or stop emails I send via Exchange from just disappearing into the ether.

This time, it's something else and I can't blame the iPhone; I think all modern smart phones have the same problem. The problem is that I constantly have to switch between apps to see my messages. Here are screenshots of 5 of the messaging clients I use regularly:

Voicemail | Exchange Email | SMS/MMS | Facebook | Twitter

This list doesn't include real-time messages like IM, or notifications like Twitter or RSS. I'm just talking about plain ol' async messaging. We used to think of it as "email," but really voicemail, email, SMS, MMS, Facebook messages and Twitter Direct Messages are all the same -- they are meant to queue up until you get to them.

Now, some folks would argue that SMS/MMS aren't meant to be queued; they're meant to be seen and handled immediately. Personally, I find it annoying that there is a pop-up for every single text or media message I get on my phone and there seems to be no way to turn that off. On the other hand, if I want that to happen for other types of messages, e.g. voicemail, I can find no way to turn it on even if I want to. Why are text messages special, especially since most mobile clients let you go over the 160-character limit and will just send the messages one after the other for you anyway?

iOS 4 takes a step in the right direction with the "universal" inbox:

iOS4 "universal" inbox

Here I've got a great UI for getting to all my email messages at once, but why can't it handle all my messages instead?

super-universal inbox

Not only would this put all my messages in one place at one time, but it would unify the UI and preferences across the various messaging sources. Do you want your text messages to quietly queue up like email? Done. Do you want your voicemail to pop to the front like an SMS? Done. Do you want the same swipe-to-delete gesture on your voicemail as you have with your email? Done!
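
To make the idea concrete, here's the kind of unified model I'm imagining -- a purely hypothetical sketch, not any phone's actual API -- where every source feeds one queue and pop-up vs. quiet-queue is just a per-source preference:

using System;
using System.Collections.Generic;
using System.Linq;

enum MessageSource { Voicemail, Email, Sms, Mms, FacebookMessage, TwitterDm }

class UnifiedMessage
{
    public MessageSource Source;
    public string From;
    public string Body;
    public DateTime Received;
}

class UniversalInbox
{
    readonly List<UnifiedMessage> messages = new List<UnifiedMessage>();

    // One switch per source: pop to the front the way SMS does today, or quietly queue like email.
    public Dictionary<MessageSource, bool> PopToFront = new Dictionary<MessageSource, bool>
    {
        { MessageSource.Voicemail, true },        // I want voicemail in my face...
        { MessageSource.Sms, false },             // ...and texts to wait their turn.
        { MessageSource.Email, false },
        { MessageSource.Mms, false },
        { MessageSource.FacebookMessage, false },
        { MessageSource.TwitterDm, false },
    };

    // Every source delivers through the same method, so the UI, the gestures
    // and the preferences are shared across all of them.
    public void Deliver(UnifiedMessage message)
    {
        messages.Add(message);
        if (PopToFront[message.Source])
        {
            // show the pop-up alert
        }
    }

    public IEnumerable<UnifiedMessage> Newest()
    {
        return messages.OrderByDescending(m => m.Received);
    }
}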

Maybe someone with some experience on the new Windows Phone 7 can tell me that there is a "messaging" hub that manages all this for me. Since they're already doing things like bringing Facebook pictures into the "pictures" hub (or whatever they call it), that doesn't seem completely out of the realm of possibility. If that's the case, I'll say again what I've been saying for a while -- I can't wait for my Windows Phone 7!





College info for my sophomore

I went to a college planning session at my sons' high school, not because I'm hung up on getting my sophomore into a top school, but because I thought I'd get a jump on things. I learned I was actually behind.

For one, I learned that the high school has an online system that will do some amazing things: my son can answer questions about his personality and interests and draw a straight line through to what he needs to do to get into a school so he can learn to do the jobs he'll like and be good at. Holy cow. We didn't have anything like that when I was a kid.

Further, the online system has two complete SAT and ACT practice tests in it, so, along with the PSAT that he's already taking, he can do a practice ACT, figure out which test he's best at (my 34 ACT score was way better than my 1240 SAT) and just take that one, since most schools these days accept either SAT or ACT results.

This is all freely provided by the high school and, in fact, they have counseling sessions with the students at each grade level for them to get the most from this system.

It's no wonder that 93% of students from this high school go on to 4- or 2-year college degree programs.

That was the good part.

The scary part is that my eldest, halfway through his sophomore year, is essentially halfway through his high school career. Colleges will only see his grades through the end of junior year, since most college applications are due by the middle of January of senior year at the latest. I have to sit down with my son and have the conversation about how "even if you get a 4.0 from now on, the best grades you can have are..."

Is it just me or is the world moving faster with each passing day?





you may experience some technical difficulties

I've been futzing with the site and I've got more to do, so unexpected things may happen. Last weekend I screwed with the RSS generator and that caused a bunch of folks to see old RSS entries show up again. This weekend I'm moving more of my static content into the database, so you may see a bunch of old stuff pop up.

Feel free to drop me a line if you see anything you think needs fixing. Thanks for your patience.





On Building a Data-Driven E-Commerce Site

The following is a preprint of an article for the NDC Magazine to be published in April.

It had been a long, hard week at work. I had my feet up when a friend called and popped the question: “Do you know how to build web sites?”

That was about a month ago. After swearing to her that I spend my days helping other people build their web sites, so I oughta know a thing or two about how to build one for her, and after some very gentle requirements gathering (you don’t want a bad customer experience with a friend!), I set about designing and building bestcakebites.com, a real-world e-commerce site.

She didn’t need a ton of features, just some standard stuff:

· Built on a technology I already knew so I could have the control I needed.

· Listing a dozen or so products with pictures and descriptions.

· A shopping cart along with, ideally, an account system so folks could check their order status or reorder easily.

· Shipping, handling and tax calculation.

· Taking payment.

· Sending email notifications of successful orders to both the customer and the proprietor.

As it turns out, there are a bunch of ways to skin this particular cat, but because I was a busy fellow with a more-than-full-time job and a book I’m supposed to be writing, instead of falling prey to my engineering instinct to write my own web site from scratch, I decided to see what was out there.

There are quite a few e-commerce web site solutions in the world, several of them recommended by PayPal, as well as one that PayPal itself provides, if you don’t mind sending shoppers off to PayPal’s web site. And in fact, I did mind. Requirement #1 was that I needed complete control over the code and the look and feel of the site. I didn’t want to configure somebody else’s web site and risk going off of her chosen domain name or not being able to tweak that one little thing that meant the difference between #succeed and #fail. (Friend customers are so picky!)

The e-commerce solution I picked was the one I found on http://asp.net (I am a Microsoft employee, after all): nopCommerce. It’s an open source solution based on ASP.NET and CSS, which meant that I had complete control when it wasn’t perfect (control I used a few times). It was more than full-featured enough, including not only a product database, a shopping cart, shipping calculation and payment support, but also categories and manufacturers, blogs, news and forums, which I turned off to keep the web site simple (and to keep the administration cost low). Unexpected features that we ended up liking included product variants (lemon cake bites in sets of 8, 16 and 24 made up three variants, each with its own price but sharing a description), product reviews, ratings and site-wide search.

The real beauty of nopCommerce, and the thing that has been the biggest boon, is that the whole thing is data-driven from SQL Server. To get started, I ran the template web site that was provided; it detected that it had no database to work from, then created and configured the initial database for me, complete with sample data. Further, not only is everything data-driven for the products, orders and customers the way you’d expect, but also for the settings that control the web site’s behavior itself.

For example, to get shipping working, I chose from a number of built-in shipping mechanisms, e.g. free, flat rate, UPS, USPS, FedEx, etc., and plugged in my shipper information (like the user name and password from my free usps.com shipping calculation web service account).

With this configuration in place, the next order the site took used that shipper: it pulled the size and weight measurements for the ordered products from the database, called the web service as configured (also from the database) to get the set of shipping options from that shipper, e.g. Express Mail, Priority Mail, etc., augmented the shipping prices with the per-product handling charges and let the user pick the option they wanted. All I had to do was use the administration console, tag each product with size information and tell nopCommerce that I’d like USPS today, please.
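
To give a feel for that pattern, here's a simplified sketch of my own (not nopCommerce's actual API): the shipper, the credentials and the handling charges are all just data handed to the code, so changing them never means changing the code:

using System.Collections.Generic;

// All of these values come from database rows, not from code.
class ShipperSettings
{
    public string Name;       // e.g. "USPS"
    public string UserName;   // rate-calculation web service credentials
    public string Password;
}

class OrderedProduct
{
    public decimal WeightLbs;        // the size/weight tagged on each product via the admin console
    public decimal HandlingCharge;   // per-product handling
}

class ShippingOption
{
    public string Name;   // e.g. "Priority Mail"
    public decimal Rate;
}

interface IShippingRateService
{
    // Calls the configured shipper's web service with the order's total weight.
    List<ShippingOption> GetRates(ShipperSettings settings, decimal totalWeightLbs);
}

class ShippingCalculator
{
    public List<ShippingOption> GetOptions(
        IEnumerable<OrderedProduct> products, ShipperSettings settings, IShippingRateService rates)
    {
        decimal weight = 0m, handling = 0m;
        foreach (var p in products)
        {
            weight += p.WeightLbs;
            handling += p.HandlingCharge;
        }

        // Augment each carrier rate (Express Mail, Priority Mail, ...) with the
        // per-product handling charges before showing the options to the shopper.
        var options = rates.GetRates(settings, weight);
        foreach (var o in options)
            o.Rate += handling;
        return options;
    }
}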

Everything worked this way, including tax calculation, payment options (we chose PayPal Direct and Express so that folks with a credit card never left our site whereas folks with PayPal logged into their account on the familiar paypal.com), localization, whether to enable blogs, news, forums, etc. Most of the time when I wanted to make a change, it was just a matter of flipping the right switch in the database and not touching the code at all.

One extreme example of where the data-driven nature really came through was the order numbers generated by the site. During testing, I noticed that our order numbers were monotonically increasing from 1. Having ordered from a competitor’s site whose order number was only 103, clearly showing off what amateurs they were (and the order itself took a month to arrive after two pestering emails, so it was clear how amateur they really were), I didn’t want us to appear like newbies in our order-confirmation emails (which nopCommerce also generated for us). So I found the Nop_Order table and used SQL to increase the seed on the identity column, which was clearly the origin of the order number:

DBCC CHECKIDENT (Nop_Order, RESEED, 10047)

From then on, every order that came through projected experience simply because of its order number, which I’d changed without touching a line of code. If helping you “fake it ‘til you make it” isn’t enough reason to love a data-driven solution, I don’t know what is!





The incomplete list of impolite WP7 dev requests

In my previous list of WP7 user requests, I piled all of my user hopes and dreams for my new WP7 phone (delivery date: who the hell knows) onto the universe as a way to make good things happen. And all that’s fine, but I’m not just a user; like most of my readers, I’m also a developer and have a need to control my phone with code. I have a long list of applications I want to write and an even longer list of applications I want other developers to write for me.

Today at 1:30p is the first public presentation of how to do WP7 programming, so to affect the future, I have to get my feature requests for the Windows Phone 7 Series development environment posted now!

· I want the legendary Microsoft tools for code editing, UI design, debugging, deployment, version control, add-ins, project management, etc. Please just let me install the “Windows Phone 7 Series SDK” and have Dev10 light up with new project and project item templates et al.

· I definitely want to be able to write C#. Since Charlie’s mentioned Silverlight, it seems like I’ll be able to do just that.

· I want to be able to mark a program as useful for background processing, e.g. if I’m writing Pandora, and let that trigger a question for the user as to whether to allow it or not, ideally with a message like “Pandora would like to continue to run in the background using XX% of your battery. Is that OK?”

· For most apps that would like to appear as if they’re running in the background, I want to register a bit of code to run when their cloud-based notifications come in, e.g. a new IM or sports score.

· I want to be able to access data from built-in apps, e.g. contacts, appointments, etc.

· Obvious things:

  o Notification when the user switches orientation

  o Access to the compass, GPS, network, camera, mic, etc.

  o Access to the network using TCP, HTTP, Atom, AtomPub and OData.

  o Low- and high-level access to gestures

  o A nice set of built-in standard controls, including simple things like text boxes and more complicated things like maps and routes

  o Integration with existing apps, e.g. the ability to launch the map app from my own app at a specific location or with a specific route.

  o The ability to create custom controls/do custom drawing.

  o Serialization so I can keep user data between sessions, and notifications when my app is being whacked.

  o App-selected keyboards for specific tasks, e.g. entering a URL.

That doesn’t seem like a very big list. I must be missing something. : )




