“Encourage” Visual Studio Extension

From : “Encourage” Visual Studio Extension | Coding4Fun Blog | Channel 9.

Since I’ve not been doing the Visual Studio Monday posts these past few weeks, posts have been piling up.

There are just too many cool things in that pile; I can’t help myself. So instead of “Windows Wednesday” it’s going to be a Haack Wednesday!

Phil Haack posted a pretty unusual Visual Studio extension, with source of course, that I hope will encourage (or not, you’ll see… lol) you too…

Encourage

Adds a bit of whimsy to your work day.

There are times when writing code is drudgery. In those dark times, bathed in the soft glow of your monitor, engrossed in the rhythmic ticky tacka sound of your keyboard, a few kind words can make a big difference. And who better to give you those kind words than your partner in crime – your editor.

Encourage for Visual Studio is a whimsical extension to Visual Studio that adds just a little bit of encouragement throughout your day.

Every time you save your document, this extension gives you an unobtrusive bit of good cheer and encouragement.

image

The source code is available on GitHub.

Your Editor should Encourage You

There are times when writing code is drudgery. That love for code becomes obsession and leads to an unhealthy relationship. Or worse, there are times when the thrill is gone and the love is lost. You’re just going through the motions.

In those dark times, bathed in the soft glow of your monitor, engrossed in the rhythmic ticky tacka sound of your keyboard, a few kind words can make a big difference. And who better to give you those kind words than your partner in crime – your editor.

With that, I give you ENCOURAGE. It’s a Visual Studio extension that provides a bit of encouragement every time you save your document. Couldn’t we all use a bit more whimsy in our work?

What?!

Yes, it’s silly. But try it out and tell me it doesn’t put an extra smile on your face during your day.

This wasn’t my idea. My co-worker Pat Nakajima came up with this idea and built a TextMate extension to do this. He showed it to me and I instantly fell in love. With the idea. And Pat, a little.

Get Involved!

As of today, this only supports Visual Studio 2013 because of my ineptitude and laziness. I welcome contributions to make it support more platforms.

Parting Thoughts

On the positive side, when you need a specific service, it’s nice to be able to slap an [Import] attribute on it and magically have the type available. The extensibility of Visual Studio appears to be nearly limitless.

On the downside, it’s ridiculously difficult to write extensions to do some basic tasks. Yes, a big part of it is the learning curve. But when you compare the TextMate example to what I had to do here, clearly there’s some middle ground between simplicity and power.

… [Click through to read it]

Settings for your Visual Studio Extension

Recently I wrote what many consider to be the most important Visual Studio Extension ever shipped – Encourage for Visual Studio. It was my humble attempt to make a small corner of the world brighter with little encouragements as folks work in Visual Studio. You can get it via the Visual Studio Extension Manager.

But not everyone has a sunny disposition like I do. Some folks want to watch the world burn. What they want is Discouragements.

Well, an idiot might write a whole other Visual Studio extension with a set of discouragements. I may be many things, but I am no idiot. This problem is better solved by allowing users to configure the set of encouragements to be anything they want.

And that’s what I did. I added an Options pane to allow users to configure the set of encouragements. It turned out to be a more confusing ordeal than I expected. But with some help from Jared Parsons, I may now present to you, discouragements!

image

Encourage options

So if you’re of the masochistic inclination, you can treat yourself to custom discouragements all day long if you so choose.

Discouragement in use

Challenges and Travails

So why was this challenging? Well, like many things with development platforms, doing the basic thing is really easy, but when you want to deviate, things get hard.

UIElementDialogPage

Thankfully, Jared pointed me to the UIElementDialogPage.

If you want to provide a WPF User Control for your Visual Studio Extension, derive from UIElementDialogPage and not DialogPage like all the samples demonstrate!

It does all the necessary WndProc magic under the hood for you. Note that it was introduced in Visual Studio 2012 so if you take a dependency on it, your extension won’t work in Visual Studio 2010. Live in the present I always say.
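For reference, here is a minimal sketch of what such a page looks like. This is not the actual Encourage source; the `EncouragementOptionsControl` name is a made-up placeholder for whatever WPF UserControl holds your settings UI. The key point is that UIElementDialogPage only asks you to override its `Child` property:

```csharp
using System.Windows;
using Microsoft.VisualStudio.Shell;

// Sketch of a WPF-based options page. EncouragementOptionsControl is a
// hypothetical WPF UserControl containing the actual settings UI.
public class EncouragementOptionsPage : UIElementDialogPage
{
    private EncouragementOptionsControl _control;

    // UIElementDialogPage hosts this WPF element and handles the
    // WndProc and keyboard-navigation plumbing for you.
    protected override UIElement Child
    {
        get { return _control ?? (_control = new EncouragementOptionsControl()); }
    }
}
```

You then register the page on your package with the usual `ProvideOptionPage` attribute, same as with a classic DialogPage.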

Storing Settings

The other thing I learned is that AppSettings is not the place to save your extension’s settings. As Jared explained,

… [Click through for the whole thing]

Upcoming Features Of C#

Some of you may have seen this already, and apologies if you have, but Mads Torgersen has given out a very useful link that he urged us to share with the wider community.

This link outlines some of the upcoming changes to the C# language. The linked document talks in more detail about what you can expect, but for those who want a quick breakdown, you will see things like:

  • Auto property enhancements
  • Primary constructors
  • Expression bodied function members
  • Initializers in structs
  • Using static
  • Exception filters
  • Declaration expressions
  • Null-conditional operators
  • Index initializers
  • Await in catch and finally blocks
  • Binary literals and digit separators
  • Extension Add methods in collection initializers
  • Improved overload resolution
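A few of these are easy to sketch. The snippet below is my own illustration (the `Person` class is invented for the example), showing auto-property initializers, expression-bodied members, the null-conditional operator, index initializers and exception filters:

```csharp
using System;
using System.Collections.Generic;

public class Person
{
    // Auto-property enhancement: getter-only property with an initializer
    public string Name { get; } = "Unknown";

    // Expression-bodied function member
    public string Shout() => Name.ToUpper();
}

public static class Demo
{
    public static void Main()
    {
        Person person = null;

        // Null-conditional operator: yields null instead of throwing
        string name = person?.Name;
        Console.WriteLine(name ?? "(no person)");

        // Index initializer for dictionaries
        var codes = new Dictionary<string, int> { ["low"] = 1, ["high"] = 2 };
        Console.WriteLine(codes["high"]);

        try
        {
            throw new InvalidOperationException("retryable");
        }
        catch (InvalidOperationException ex) when (ex.Message == "retryable")
        {
            // Exception filter: the catch block only runs when the filter is true
            Console.WriteLine("caught via filter");
        }
    }
}
```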

You can find out more by reading the spec : Upcoming Features In C#

via Upcoming Features Of C# | Sacha’s blog.

Changes at the Communauté .NET Montréal

The Communauté .NET Montréal has existed for more than 20 years. Over time, this user group has carried different names to reflect the technologies presented at its events. We went from the “Groupe d’usagers Visual Basic et C++ de Montréal”, dropped C++ to become the “Groupe d’usagers Visual Basic de Montréal”, and then the “Groupe d’usagers Visual Studio Montréal” when .NET was released. The latest incarnation was meant to be a gathering of different user groups sharing a common technology, hence the current name “Communauté .NET Montréal”.

It is now time to refocus, or rather, to broaden the horizons a bit. Having “.NET” in the name is a bit odd for the SQL Server and architecture groups. Moreover, with web development, some developers now spend more time in JavaScript than in .NET, and what about Azure, which supports technologies other than Microsoft’s? Don’t worry, we are not going to start covering Java; the focus will still be development on the Microsoft platform! That is why the new name will be “MS DEV MTL”, which you will be able to reach at msdevmtl.com.

MS for Microsoft, obviously. By using these two letters, Microsoft is easily recognized, but we do not use a trademark that does not belong to us.

DEV for developer, obviously. The focus is and will always be software development with Microsoft tools and/or the Microsoft platform.

MTL for Montréal, obviously.

Together they form an acronym that is easy to understand and remember in French and even in English.

A new name also means a new logo. We launched a contest on 99Designs.com and selected 8 finalists, but it is up to you to choose the winner. To do so, you have until Wednesday, July 30 to rate the finalists using this survey: https://99designs.ca/logo-design/vote-hrtw0m

Just give 0 to 5 stars to each design. Rate the designs you like with 1 to 5 stars and give no stars to those you don’t like. You can also optionally leave a comment on each design.

In the coming weeks, we will activate the domain name on Meetup and update our Facebook page and Twitter feed. We will keep you informed as the changes are made.

Here are the finalist designs. Happy voting!

Technology Radar – July 2014 update

If you are like me, ready for a new challenge, you may ask yourself: what is the next big thing to learn?

I just finished a 4-year assignment and I’m now free to start something new. My last job was mainly desktop apps using WPF and WCF. If I look around me, I see more web development than desktop.

The timing is good to look for something new because ThoughtWorks just released their Technology Radar. Here are some of the things I find interesting for my field of expertise.

Techniques

Event Sourcing has moved from assess to trial. This is good because I think it’s the way to go. On my last project we replaced the big fat SQL database with Event Sourcing, and it had a lot of benefits. First of all, we were able to write all the business rules in one place: the Domain. We were also able to version our data. The system I was working on needed some kind of source control around the data, and Event Sourcing made that possible.

REST without PUT is a logical choice for me. Because I like to work with CQRS and Event Sourcing, having only one way to change the domain is a good thing. PUT implies you can change data in place, whereas POST means you always add new data, so you never lose anything.

Platform

The only new thing I find really interesting for my field is EventStore, for obvious reasons. Hadoop 2.0 is here to stay, apparently. Because it’s a big player in the Big Data world, it’s worth taking a closer look at it.

Tools

I’m sad but not surprised to see TFS on the edge of the radar. TFS version control is not as productive as other source control systems out there. Maybe that’s why Microsoft added Git integration to TFS, to try to get more adoption from the community.

Roslyn is now visible on the radar. Microsoft has done a good job building that new compiler platform for .NET languages. It will help them introduce features faster, and those features will be available to all .NET languages by default. I’m still not sure how I can use Roslyn’s capabilities in my own code, though.

Languages and Frameworks

I’m happy to see that Reactive Extensions across languages has moved to adopt. I think the concept is a good way to make applications more responsive.

With all the hype around Scala, I think I will have to take a serious look at it.

AngularJS seems to be slowly making its way toward the center of the radar. There are so many web frameworks that it’s hard to make a good choice that will last.

Conclusion

There are many other interesting things on the radar, and many others that are no longer there but still very useful. Who knows where all this will lead us.

Renewed MVP from Montreal

As a renewed MVP, I want to congratulate all renewed MVPs, especially those from Montreal, because I live there. Here is a list of my fellow MVPs along with their blogs. Take a minute to visit them; you may find some really useful information.

Renewed in July:

  • Vincent Grondin : Blog
  • Guy Barrette : Blog
  • Eric Moreau : Blog
  • Alexandre Brisebois : Blog
  • Chantal Bossé : Blog

Other Montreal MVPs

  • Erik Renaud : Blog
  • Etienne Tremblay : Blog
  • Laurent Duveau : Blog
  • Louis-Philippe Pinsonneault : Blog
  • Maxime Rouiller : Blog
  • Pascal Laurin : Blog
  • Sebastien Lachance : Blog
  • Christian Côté : Blog

If you think your name should be on this list, let me know. I may have forgotten somebody.

How Mongo DB can solve Event Sourcing versioning (Part 2 of 2)

In the previous post I talked about how to handle data type changes in events. Now we will see how to allow your events to change by adding, removing, renaming or moving any property.

The Mongo serialization process will help us migrate old events to the new format. Serialization allows some differences between the data and the type. If your class has new properties that are not present in the source data, and your object can be created without those values set, then it will work naturally.

Let’s say we have a simple entry in Mongo that looks like this:

{
  "Name" : "Foo"
}

If we simply want to add a new property, it’s not a problem. This object will deserialize well. But if we want to change the default value of the generated property, we must do something different. Of course we could set it in the constructor, but I prefer not to. I want to separate the default construction of my object from the migration of an existing one from the database.

To do that we will introduce the concept of a normalizer. The normalizer will be responsible for migrating data from one version to the next. By applying a chain of normalizers we can move from any old version to the latest. Imagine we have a NormalizationChain attribute that can hold a list of normalizers.

[NormalizationChain(typeof(NormalizeV0ToV1))]
public partial class FooEvent : DomainEvent
{
  public string Name { get; set; }
  public string Description { get; set; }
}

This class is partial so we can easily define its normalizers in another file as nested classes:

public partial class FooEvent
{
  public class NormalizeV0ToV1 : IDataNormalizer<FooEvent>
  {
    public FooEvent Normalize(FooEvent source)
    {
      source.Description = String.Empty;
      return source;
    }
  }
}

Using nested classes helps reduce the scope of the normalizer classes. There can be many with the same name without interfering with each other.

Now we need to write the code to execute the normalizers. To keep things clean we will create a generic static method that will be called by the deserialization process later:

public static class DataNormalizer
{
  [UsedImplicitly]
  public static void Normalize<T>(T data)
    where T : Versionable
  {
    Type entityType = data.GetType();
    NormalizationChainAttribute attribute = 
      entityType.GetCustomAttributes(typeof (NormalizationChainAttribute), true)
                .OfType<NormalizationChainAttribute>()
                .SingleOrDefault();

    if (attribute != null)
    {
      // Create the list of normalizers to apply, skipping previous versions
      IDataNormalizer<T>[] normalizers = 
        attribute.NormalizerTypes
                 .Select(Activator.CreateInstance)
                 .OfType<IDataNormalizer<T>>()
                 .Skip(data.Version)
                 .ToArray();

      foreach (var normalizer in normalizers)
      {
        data = normalizer.Normalize(data);
        data.Version = data.SerializationVersion;
      }
    }
  }
}
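The `Skip(data.Version)` call is what makes the chain incremental. To see why, here is a small self-contained sketch of my own (simplified types, no attributes or reflection): a document saved at version 1 only goes through the V1→V2 step, while a version-0 document goes through both.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Simplified stand-ins for the real types, just to show the skipping logic.
interface INormalizer { List<string> Apply(List<string> data); }

class V0ToV1 : INormalizer
{
    public List<string> Apply(List<string> data) { data.Add("v1-field"); return data; }
}

class V1ToV2 : INormalizer
{
    public List<string> Apply(List<string> data) { data.Add("v2-field"); return data; }
}

static class Demo
{
    static List<string> Normalize(List<string> data, int savedVersion)
    {
        var chain = new INormalizer[] { new V0ToV1(), new V1ToV2() };

        // Skip the normalizers the stored document has already been through.
        foreach (var normalizer in chain.Skip(savedVersion))
            data = normalizer.Apply(data);

        return data;
    }

    static void Main()
    {
        // A version-0 document goes through both steps...
        Console.WriteLine(string.Join(",", Normalize(new List<string>(), 0)));
        // ...while a version-1 document only gets the second one.
        Console.WriteLine(string.Join(",", Normalize(new List<string>(), 1)));
    }
}
```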

Mongo uses the ISupportInitialize interface to intercept the serialization/deserialization process. In the previous post I showed you the Versionable base class of DomainEvent. Now we will add to it the capacity to call the normalizers:

[Serializable]
public abstract class Versionable : CatchAllSupport, ISupportInitialize
{
  private int? _version;

  public int Version
  {
    get { return _version.GetValueOrDefault(SerializationVersion); }
    set { _version = value; }
  }

  protected internal virtual int SerializationVersion
  {
    get { return 0; }
  }

  #region ISupportInitialize Members

  void ISupportInitialize.BeginInit()
  {
  }

  void ISupportInitialize.EndInit()
  {
    // We must construct the method call by reflection to make sure that
    // the call to Normalize is done with the actual entity type and not
    // with the base class Versionable
    MethodInfo method = typeof(DataNormalizer).GetMethod(GetMethodName(DataNormalizer.Normalize));
    MethodInfo genericMethod = method.MakeGenericMethod(GetType());

    genericMethod.Invoke(null, new object[] {this});
  }

  private string GetMethodName(Action<Versionable> method)
  {
    return method.Method.Name;
  }

  #endregion
}

With that, we have everything in place to handle all kinds of changes. Let’s say you want to remove a property from this event:

{
  "Name" : "Foo",
  "Description" : "To be Removed"
}

Now your event class can look like this:

[NormalizationChain(typeof(NormalizeV0ToV1))]
public partial class FooEvent : DomainEvent
{
  public string Name { get; set; }

  protected internal override int SerializationVersion
  {
    get { return 1; }
  }
}

If we use this class to deserialize an old event, the description will be extracted and put into the CatchAll property. We have to remove it from there, because if we don’t it will be serialized back into the database.

public partial class FooEvent
{
  public class NormalizeV0ToV1 : IDataNormalizer<FooEvent>
  {
    public FooEvent Normalize(FooEvent source)
    {
      source.CatchAll.Remove("Description");
      return source;
    }
  }
}

With the combination of the add and remove property functionality, we can migrate any shape of event into any other. This will allow our event model to evolve over time and keep our system working properly.

How Mongo DB can solve Event Sourcing versioning (Part 1 of 2)

First of all, I invite you to read my previous post on How to support changes in your software. It will give you a good heads-up on where I’m going with this post.

Having a strategy to support change is a good practice, but sometimes it fails on the first change because, well, you know, it’s not that easy. I’m a big fan of document databases and NoSQL. Mongo DB is a good match for me because it’s easy to install on Windows and use from a C# application. I won’t go into detail on how to create a .NET application that connects to Mongo; there are plenty of sites out there (here, here, here and here) that can help you with that. Document databases are often referred to as schema-less databases, but that’s not completely true. There is a schema; it’s just not as rigid and well defined as in traditional databases. The schema used by Mongo is based on the JSON format. For example, look at the following JSON document. (Only the event payload is shown here and in all other samples.)

{
    "Name" : "Logitech wireless mouse",
    "Price" : "29.99$"
}

This document could be the definition of an event in an Event Store. The C# class to handle this event would be:

public class ProductAddedEvent : DomainEvent
{
    public string Name { get; set; }
    public string Price { get; set; }
}

The missing fields are managed by the DomainEvent base class. This is a really simple object to illustrate the concept.

Now we realize that we made an error in the definition of the Price field. We want to convert it to a double type to be able to do some calculations with it. In an event sourcing system the events are immutable; they must never be changed. The past is written in stone. So we need to work pretty much like in functional programming to evolve our event to a new format. The first thing we need to do is tag each evolution with a version number. Because all our events derive from the DomainEvent class, that is easy. First, add a base type to DomainEvent to hold the versioning concerns:

[Serializable]
public abstract class DomainEvent : Versionable
{
  // ...
}

Now take a look at the Versionable class:

[Serializable]
public abstract class Versionable
{
  private int? _version;

  public int Version
  {
    get { return _version.GetValueOrDefault(CurrentVersion); }
    set { _version = value; }
  }

  protected internal virtual int CurrentVersion
  {
    get { return 1; }
  }
}

This will set any new DomainEvent to version 1 by default and provide the ability to change it in the future. The Version field will be serialized and saved in Mongo DB. Here is the C# class that defines the event payload:

public class FooEvent : DomainEvent
{
    public string Name { get; set; }
    public string Value { get; set; }
}

The serialized version of that class will look like this :

{
    "Version" : 1,
    "Name" : "Foo",
    "Value" : "3.00$"
}

Now that we know which version of the document we are processing, we can start thinking about changing the C# class. The goal is to preserve all needed data from one version to the other. Depending on the type of modification, different techniques can be used. Our first change will be to change the Value field type from string to double. The only good way to do that is to use Mongo’s BsonSerializer attribute.

public class FooEvent : DomainEvent
{
  public string Name { get; set; }
  [BsonSerializer(typeof(AmountToDoubleSerializer))]
  public double Value { get; set; }
}

public class AmountToDoubleSerializer : BsonBaseSerializer
{
  public override object Deserialize(
    BsonReader bsonReader,
    Type nominalType,
    Type actualType,
    IBsonSerializationOptions options)
  {
    VerifyTypes(nominalType, actualType, typeof(Double));

    BsonType bsonType = bsonReader.GetCurrentBsonType();
    switch (bsonType)
    {
    case BsonType.String:
      string readString = bsonReader.ReadString();
      double value = Double.Parse(readString.Replace("$", ""), CultureInfo.InvariantCulture);
      return value;
    case BsonType.Double:
      return bsonReader.ReadDouble();
    default:
      string message = string.Format("Cannot deserialize BsonString from BsonType {0}.", bsonType);
      throw new FileFormatException(message);
    }
  }

  public override void Serialize(
    BsonWriter bsonWriter,
    Type nominalType,
    object value,
    IBsonSerializationOptions options)
  {
    if (value == null)
    {
      throw new ArgumentNullException("value");
    }

    bsonWriter.WriteDouble((Double)value);
  }
}

The AmountToDoubleSerializer will be able to convert the string type to double, but also to read and save properties that are already in double format. This allows our system to read past events as well as new ones.
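Stripped of the Mongo plumbing, the conversion itself is just the parse below (a tiny sketch of my own; InvariantCulture matters so the parse does not depend on the machine’s locale):

```csharp
using System;
using System.Globalization;

static class Demo
{
    // The same conversion the serializer applies to legacy string values.
    static double ParseAmount(string raw) =>
        double.Parse(raw.Replace("$", ""), CultureInfo.InvariantCulture);

    static void Main()
    {
        Console.WriteLine(ParseAmount("29.99$")); // legacy string format
        Console.WriteLine(ParseAmount("3.00$"));
    }
}
```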

In the next post I will show how we can do other transformations, such as adding, removing and renaming a property.

How to support changes in your Software

Event Sourcing is a very powerful architectural concept. Pretty much like a bank account statement, everything you write on it is considered immutable. There is no way you can change anything in the past. The only option is to add a new entry to fix a previously made error.

The goal of such a concept is to never lose any information, not even mistakes. In real life we make mistakes all the time. In any good entry form there will be a lot of validation in place to limit those mistakes. But even then, mistakes can happen.

If the only mistakes were in the data, it wouldn’t be too much of a problem. Any bad data entry can be corrected by a compensating entry that does the exact opposite. But what happens when mistakes are structural?

In software development we always have to choose amongst many different options. Every time we choose one structure over another, we have to live with the consequences of that choice for the rest of the life of our software. The ability to change the structure of our software decreases exponentially as features are added: for every new feature we multiply the number of ways we can combine it with other existing features.

How can we build software that will last longer than its competitors? How can we embrace changes in our software? In software developed with CQRS and Event Sourcing it’s easy. There are only two main parts that are subject to big changes: the Read Model and the Domain.

Changes in the Read Model are not so bad. Because all Read Models are built by replaying all historical events, it’s easy to flush them and rebuild them from scratch. The real challenge is to maintain domain event integrity across the lifetime of your application.

As I wrote before, the event store is meant to be immutable, therefore it shouldn’t change in any way. But what if you realize that you forgot an important piece of information that you need to track in your domain? Worse, what if you need to remove some properties of your event, or an event property needs to change its type? Of course those changes should not be the norm but, you know, sh*t happens. In a CQRS and Event Sourcing application supported by Domain Driven Design (DDD) it should be possible to allow such changes. The domain itself may evolve over time. Some business rules may be changed or added, and they may need more information to be applied.

The domain objects are good at hiding the internal processing of business rules but, in order to do their job, they need some external information. That information will be persisted in the event store so it will be available later to rebuild the entire object state and be ready to accept any new state change.

Conceptually, the domain only needs to know the latest structure of each event in the event store. It should be able to apply them as-is to its internal state. In fact, the event store will contain earlier versions of those events. The goal is to treat them as if they were their latest counterparts. The best way to do this is exactly how source control systems such as Git or Mercurial work. In those SCMs, each change set is recorded as a delta from the previous state: they record any new or removed lines of code. So to make this work, all we need is a piece of code that manages the transition from any version to the next. Then we apply those transitions from the last saved version up to the latest version.

How can we do that with our events? See my next post to learn how to do just that with a Mongo DB event store.

Imagine a world where the past is all and only truth

Your computer system must be full of structured, relational databases. You might take regular backups of them if you don’t want to lose any information. Despite all those precautions, you lose every in-between state of your information.
If all you care about is the final state, that’s not a big deal, but there are good chances you will have to answer questions like:

  • How much time passed between the first item being put in the shopping cart and the completion of the transaction?
  • How many times was an item removed from a shopping cart?
  • What was the state of the purchase order before the crash?

If you didn’t implement some way to trace those events, you won’t have any answer to give. Even if you find a way to do it, you will only get information from the moment you put it in place.
I suggest a new approach so that you never have to answer “it’s impossible” when your boss asks you that kind of question: CQRS and Event Sourcing. The power of this duo comes mainly from Event Sourcing. Remember everything you learned about side effects and why you should do everything possible to avoid them? Here, those effects are the only important things. In that kind of system we do not keep data per se; we keep the effects of actions, which we call past events. For example, if we execute the following command:

var command = new AddItemToCart(cartId, itemId);
command.Execute();

The system will produce the following event:

var @event = new ItemAdded(cartId, itemId);
ApplyEvent(@event);

The strength of the system shows when we delete items. We are able to trace those deletions as events too:

var command = new RemoveItemFromCart(cartId, cartItemIndex);
command.Execute();

Will produce:

var @event = new ItemRemoved(cartId, cartItemIndex);
ApplyEvent(@event);

In this system, all events derive from a base event:

public class EventBase
{
  public Guid Id { get; set; }
  public int Sequence { get; set; }
  public DateTime TimeStamp { get; set; }
}

A framework class ensures that every event gets its date and sequence properties set.
In such a system, only events are valuable. They reflect what really happened in the past. Those events will be used to build specific read models for each surrounding system that is interested. Each one will have its own small database to answer its needs, and any change to that database will only affect that system.
Those read models are only a transient representation of the system’s past.
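To close with a concrete sketch: rebuilding a read model is nothing more than a left fold over the event stream. The types below are simplified stand-ins of my own (the real ItemRemoved above uses a cart item index; here I key on an item id to keep it short):

```csharp
using System;
using System.Collections.Generic;

// Minimal stand-ins for the events used above.
abstract class EventBase { }
class ItemAdded : EventBase { public string ItemId; public ItemAdded(string id) { ItemId = id; } }
class ItemRemoved : EventBase { public string ItemId; public ItemRemoved(string id) { ItemId = id; } }

static class Demo
{
    // The read model here is simply the current cart contents.
    static List<string> Replay(IEnumerable<EventBase> history)
    {
        var cart = new List<string>();
        foreach (var e in history)
        {
            var added = e as ItemAdded;
            if (added != null) { cart.Add(added.ItemId); continue; }

            var removed = e as ItemRemoved;
            if (removed != null) { cart.Remove(removed.ItemId); }
        }
        return cart;
    }

    static void Main()
    {
        var history = new EventBase[]
        {
            new ItemAdded("mouse"),
            new ItemAdded("keyboard"),
            new ItemRemoved("mouse"),
        };

        // Flushing and rebuilding a read model = replaying the full history.
        Console.WriteLine(string.Join(",", Replay(history)));
    }
}
```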