For a little more than seven years, bbv Software Services AG has been my first employer after my studies at the University of Applied Sciences in Horw, Lucerne. Today it’s time to say goodbye. But before I talk about what comes next, allow me to take a step back and reflect on my time at bbv Software Services. Continue reading
This Christmas I upgraded my IT infrastructure to the latest and greatest. I bought a new MacBook Pro Retina 15-inch with a 1 TB SSD, 16 GB RAM, an NVIDIA graphics card and the latest i7 processor available for MacBooks. This is the detailed spec of the machine I now own:
- 1TB PCIe-based Flash Storage
- 16GB 1600MHz DDR3L SDRAM
- 2.8GHz Quad-core Intel Core i7, Turbo Boost up to 4.0GHz
- NVIDIA GeForce GT 750M with 2GB of GDDR5 memory
The performance and hardware you get for the price is really awesome. Then I started looking around for good external screens for both reading online content and developing software. My plan was to go for 4K screens. Before I started evaluating them, I checked the specs to see whether the MacBook could handle 4K resolution and multiple screens. My quick research showed that connecting 4K screens is possible, although I overlooked one small but important detail which I’ll explain later. First I ordered two Samsung LU28D590DS screens. But a good friend of mine told me that the stand of these screens is so flimsy that the screen trembles heavily even when you type, so I cancelled my order. After comparing the prices of the remaining 4K screens I finally settled on the Asus PB287Q. It is “only” a TN panel but has a very good price for its capabilities.
When I got the screens I booted Mac OS and connected both 4K screens over Mini DisplayPort (on the Thunderbolt connector) to my MacBook. Beforehand I had enabled DisplayPort 1.2 support, which is required for 4K at 60Hz. Then came the shock: only one screen worked at a time. First I thought something was wrong with the screens. I switched them off and on, swapped cables and more. Still only one worked at a time. So I googled the specs again and found the missing piece of the puzzle in a support article. Essentially it says that in order to get 4K at 60Hz you need Multi-Stream Transport enabled on the screen, which I did, and that then only one 4K screen is supported at 60Hz.
Let me think about this again. I have Thunderbolt 2 with approximately 20 Gbps of bandwidth. DisplayPort 1.2 needs somewhere around 17 Gbps, so this should technically be plenty. I have two Thunderbolt 2 ports on my MacBook, and still Mac OS refuses to turn on my second 4K screen. I couldn’t believe it. After some more research and reading the discussion threads here and here, I found out that it is really a driver restriction and not a hardware one.
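The numbers above can be sanity-checked with a quick back-of-the-envelope calculation. This is a sketch assuming the commonly cited nominal figures (DisplayPort 1.2 HBR2: 4 lanes at 5.4 Gbps with 8b/10b encoding; 4K UHD at 60Hz with 24 bits per pixel, ignoring blanking intervals):

```python
def dp12_effective_gbps():
    """Effective DisplayPort 1.2 HBR2 bandwidth in Gbps."""
    lanes, gbps_per_lane = 4, 5.4
    raw = lanes * gbps_per_lane      # 21.6 Gbps raw link rate
    return raw * 8 / 10              # 8b/10b encoding leaves 17.28 Gbps

def uhd60_payload_gbps(bits_per_pixel=24):
    """Pixel payload for 3840x2160 at 60Hz, active pixels only."""
    return 3840 * 2160 * 60 * bits_per_pixel / 1e9

print(dp12_effective_gbps())   # 17.28
print(uhd60_payload_gbps())    # 11.943936
```

So a single 4K@60Hz stream fits comfortably inside one DisplayPort 1.2 link, and two Thunderbolt 2 ports should carry one each, which is exactly why a driver restriction rather than a hardware one is the plausible explanation.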
So how can I get around this driver restriction? Nothing could be simpler: you prepare a Boot Camp partition, install Windows 8.1, boot into Windows 8.1 on the Boot Camp partition, connect one 4K screen to each Thunderbolt 2 port, and Windows 8.1 detects both displays and runs them at 60Hz in full 4K resolution! Here’s the proof:
Dear Apple: Are you silently saying that Mac OS is dead and Windows 8.1 is the future platform for MacBooks?
Thanks to Matt Ellis and some bug hunting on my part, I can proudly say that we have released the ReSharper 9 integration for Machine.Specifications. You can now install the plugin using the extension manager provided in ReSharper 9.
For those who are using ReSharper versions older than 9, I pushed an update to the plugin for ReSharper 8.2. It resolves an issue that occurred when you switched from Debug to Release mode (or vice versa) while a unit test session was open: previously you had to restart Visual Studio so that the ReSharper plugin picked up the correct output path of the specs assembly. Older ReSharper versions are no longer supported, nor is installation via batch files. If you find anything, feel free to open issues on the GitHub repository.
In my last post I showed you the TransactionScope class and how you can write your own enlistments to participate in transactions. The code we wrote was all synchronous. This time we are going deep into the abyss and change our code sample to a completely asynchronous API. Let’s explore what the code could look like: Continue reading
In my last installment I gave a brief overview of Service Bus for Windows Server. In this post I’m going to look at high availability and why it is important. In my last project my job was to help a team build a robust and reliable infrastructure that leverages Service Bus for Windows Server. On the first day we sat together and discussed the various questions the team had regarding reliability and availability. The first question started like this: “What can we do in code when Service Bus for Windows Server is not available?” My answer was the following: “In code? Besides retrying the connection to the Service Bus a configurable number of times, you cannot really do much. If your most important communication layer is down for a longer period of time, your system should detect that problem and gracefully shut down its services. If you are not specifically building for an occasionally connected system, your infrastructure needs to be made reliable and available. Trying to solve those concerns in your system’s code is a waste of time and money.” We shook hands, the customer said thank you, and I went home. Problem solved without writing a single line of code. Just joking!
“Use Messaging to transfer packets of data frequently, immediately, reliably, and asynchronously, using customizable formats. [..]” This quote from the Enterprise Integration Patterns book by Gregor Hohpe and Bobby Woolf shows that one of the fundamental principles of messaging is that messages need to be transferred immediately and reliably. In order to achieve this, our Service Bus infrastructure needs to be reliable and highly available. Because Service Bus for Windows Server is a broker-based transport, producers and consumers rely on the availability of a centralized infrastructure. But what could possibly go wrong? Continue reading
I would consider this blog post unnecessary knowledge for most programming tasks you’ll do during your lifetime as a developer. But if you are involved in building libraries or tools used in integration scenarios, you might find it helpful. The .NET platform has, under System.Transactions, a very nifty class called TransactionScope. The TransactionScope allows you to wrap your database code, your messaging code and sometimes even third-party code (if supported by the third-party library) inside a transaction and only perform the actions when you actually want to commit (or complete) the transaction. As long as all the code inside the TransactionScope is executed on the same thread, all the code on the call stack can participate in the TransactionScope defined in your code. You can nest scopes, create new independent scopes inside a parent transaction scope, or even create clones of a TransactionScope, pass the clone to another thread and join back onto the calling thread, but all this is not part of this blog post. In this blog post I will cover how to write so-called enlistments which can participate in a TransactionScope from your own code, and in the next post how to overcome the challenges you’ll face in asynchronous scenarios. Continue reading
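To make the enlistment idea concrete without reproducing the actual .NET code from the linked post, here is a minimal Python sketch of the underlying pattern: a scope collects enlisted resources, and on completion runs a prepare/commit vote (roughly what .NET models with IEnlistmentNotification). All class and method names here are illustrative, not part of any .NET or Python API:

```python
class Enlistment:
    """A resource that votes in a simple two-phase commit."""
    def __init__(self, name):
        self.name = name
        self.state = "active"

    def prepare(self):
        # Phase 1: persist enough to guarantee a later commit, then vote yes.
        self.state = "prepared"
        return True

    def commit(self):
        self.state = "committed"

    def rollback(self):
        self.state = "rolled back"


class TransactionScope:
    """Coordinator: collects enlistments; decides outcome on scope exit."""
    def __init__(self):
        self.enlistments = []
        self.completed = False

    def enlist(self, enlistment):
        self.enlistments.append(enlistment)

    def complete(self):
        # Mirrors calling Complete() before the scope is disposed.
        self.completed = True

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc, tb):
        if (self.completed and exc_type is None
                and all(e.prepare() for e in self.enlistments)):
            for e in self.enlistments:
                e.commit()
        else:
            for e in self.enlistments:
                e.rollback()
        return False


with TransactionScope() as scope:
    db = Enlistment("database")
    scope.enlist(db)
    scope.complete()      # omit this call and __exit__ rolls back instead

print(db.state)           # committed
```

The key point the sketch illustrates is that the enlisted resource does nothing irreversible until the coordinator tells it to: everything hinges on the prepare/commit/rollback callbacks, which is exactly the contract your own enlistments have to honor.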
Today I read this blog post about how to simplify test data preparation.
The author of the blog post states that setting up test data for tests is sometimes difficult and bloats up the test code, resulting in bad readability and maintainability. I completely agree with that.
The author continues by solving this problem by loading the test data from a file and using it in the test. That minimizes the code needed to set up the test data, but results in a disconnect between the test and the data or example used for it, leaving us with an obscure unit test.
We solve this problem differently.
This is a really quick announcement. I recently released Machine.Specifications 0.9.0. With that release I introduced a breaking change: I disabled the console output capturing by accident. If you are using console output in your specs and need to see it, I strongly advise you to upgrade to Machine.Specifications 0.9.1. You only need to upgrade the Machine.Specifications NuGet package in your solution; none of the other components are affected. This is the beauty of the new architecture!
Have fun and sorry for the inconvenience!