All posts

Engineer Show Reel
Monday March 14, 2016 | Lasse Laursen

Show reels are most commonly associated with directors or actors trying to convince a potential employer that they've got what it takes. But during my time in academia, a notion has slowly dawned on me: perhaps I, too, could do well with a show reel:

In 2014, video apparently accounted for 64% of the world's web traffic. No surprise really, given how much more efficiently video allows you to communicate a variety of topics. Don't get me wrong - a number of things should definitely not be communicated with video. For example, the ninth level of the underworld is undoubtedly reserved for the creators of 3-minute YouTube videos showcasing a 10-second task.

But if you work in a medium where something can be visualized, I would strongly suggest using that leverage. Just like with resumes, there are a number of useful heuristics I'd recommend you follow if/when putting together your own show reel:

  • Make sure to limit yourself to around 5 minutes or less (ideally 3 or less). Attention is a valuable commodity, and it's preferable to leave someone wanting more rather than bored by what's being presented. You'll notice I've crossed the crucial 3-minute threshold myself, but I'll tell you why in a bit.
  • In the best case, your show reel will, just like a resume, focus on the area of work most relevant to the person it's being presented to. However, show reels don't grow on trees, and I've personally opted for the happy medium of a somewhat broader and longer reel. For optimal results, I'd edit it down to just the 2-4 core projects that are most relevant.
  • Unlike directors and actors, you don't need someone to absorb an isolated scene, so I'd advise you to narrate your show reel. Not only does this give you an opportunity to show off your communication skills, it also lets you highlight the most impressive aspects of your work.

At the end of the day, what it really comes down to - in my opinion - is putting in more energy so someone else can save some of theirs. Reading is time-consuming, and a show reel can quickly and efficiently summarize your previous work. It won't supplant your resume, but if you work on visual projects, it can help communicate your work in an expedient manner.

PlanMixPlay release early 2016
Monday September 7, 2015 | Lasse Laursen

PlanMixPlay is undoubtedly the biggest undertaking I've ever taken on. This is both a positive and a negative in a number of ways. Things of such big scope are hard to complete all alone, because there's no group mentality to keep you working - only the gratification of accomplishing something. That feeling of accomplishment seems quite distant, however, when you're at the starting line looking at what seems to be a horizon stretching for eternity. It isn't really eternity. In fact, the end is pretty close - but because it's beyond what you can see, it might as well be eternity.

Despite this daunting outlook, these things keep me motivated:

  • If I don't do this, nobody else will. Don't get me wrong - there are lots of people who could do this, and if I don't, I figure it will eventually get done. But probably not the way I want it, and probably not as fast as I want it either. I'd argue there's a reason 'art' and 'technology' are so often separated: people tend to lean toward one or the other. It's far rarer to find someone deeply into both than into just one. If that's true, it means there are fewer people around with the motivation to build this.
  • It's something I'd like to use myself. I know I cannot possibly compete with commercially available performance software. They're building on a legacy of decades, with sizable teams and funding to boot. But if I can cook up something that has enough unique appeal to be worthwhile without all the bells and whistles... Well then, my friend. We may just have something.

I've decided to post this in order to try and lay out some sort of road map for the future. By writing down what I intend to build, I'll hopefully get better at sticking to it. So without further ado, let's look at the features I expect to manage before the first official release of PlanMixPlay.

  • Media Engine Re-Write. PlanMixPlay currently relies heavily on BASS's internal timing. This is simple and works well, but it also means that portions of audio code permeate the video engine code. This is bad and counterintuitive. A custom-built internal timing system is needed.
  • Audience feedback re-integration. This feature needs to be re-introduced in a stable and more workable manner. The code is basically all there, but it needs refactoring to get it back into proper working condition.
  • Audio/Video Handling. Audio should be manipulable, similar to vinyl. Video should be streamed rather than held entirely in memory.

That's as much as I dare to promise for now. The devil is in the details of creating these features. They're not particularly tricky on their own (apart from audio time stretching), but to integrate them well and without crashing... That's the challenge.

I hope you'll stick around to see me make good on these goals.

Pre-Release
Sunday August 9, 2015 | Lasse Laursen

Hey there,

You are among the very few - and I do mean very few - who will cast their eyes upon the pre-release of this site. Let me just namedrop that domain one more time so it sticks:

Now that you're here, grab a glass of complimentary imaginary champagne, and let me tell you a little bit about my vision. It is at this point that you're free to pretend you've suddenly spotted someone you know in the party crowd and quickly excuse yourself. You, you... you meanie! OR - you could do the honorable and respectable thing, which is to take a small sip of air and put on your best 'I'm super interested' look. You chose wisely.

Now where was I? Oh yes... The vision. No wait - let's call it an idea. 'Vision' sounds too gauche.

A few years ago, I noticed touch surfaces were growing in size, and thought 'Hey - why don't we make a cool DJ interface on that?'. The end.


Well - I did think a few more thoughts, but that's basically the original motivation for PlanMixPlay. The whole goal, you could say, is to try and push nightclubs and social gatherings forward. Nightclubs are such an interesting social concept, and a lot of things about them seem weird to me. For example, we go there with people we already know, talk to people we've met before, and socialize with as few new people as possible - despite the fact that nightclubs seem like the ideal place to meet new people. It's rarely the case anymore, I feel. Perhaps because clubs used to be run by eccentric party people who did it for the sake of the party, whereas these days clubs are mostly run by people who do it for the sake of money.

I'm not here to dump all over club owners, though. I don't envy anyone running a struggling club, as I'm sure it's a seemingly endless battle to stay relevant and popular. To the best of my knowledge, most places (around 90% or more) last a few years before being shut down, re-tooled, and re-opened six months later - all for the sake of 'newness' in an ever more desperate chase for the ever elusive customer crowd.

Nightlife has become standardized. DJs, bars, dance floors, VIP sections, bouncers, you name it. There's a blueprint for what makes a club and another blueprint for what makes an event.

So why don't we see any cool new stuff in nightclubs - interaction, live voting, networked socialization, casual gaming? Well, I'd argue there are two reasons for this:

  • There's not enough drive. As a species, we've solved the mystery of how to run a successful nightclub, and none of it involves anything beyond promotion, events, go-go dancers, DJs and booze.
  • The average party person is satisfied. Could they be more satisfied? Sure. But why bother if what's on the table is enough?

I bother. That's why. Imagine if we could dissolve the barrier between the performer and the audience a bit. Not remove it completely, but make it permeable. Make it so that you could reach through and they could reach back.

We live in a time where the mainstream majority has been well catered to for the past decade. Businesses have realized that it's well worth going after the long tail: the endless line of niches that follow their top sellers. There's no reason we can't apply the same mentality to live performances. A portion of your audience wants to interact with you.

Encourage that.


Since my later university days, I've had the fortune of sampling some fairly high-powered laptops for work. However, performance laptops (also known as gaming laptops) are not without their qualms, especially compared to the quite robust desktop gaming PCs. If my blog/website had any sort of significant following, I would probably have used it as a sounding board for my (in my mind, well-founded) grievances. Because even though consumers are generally well cared for in Europe, sometimes things do slip through the cracks.

Case in point: my earliest high-powered laptop from DELL, the fateful M17x. This machine caused me a lot of headaches back in the day; I'll spare you the details and present the abridged version. The machine contained both an integrated video card and a discrete one, and switching between the two required a reboot. Before you throw up your hands and laugh at this decision, consider that the choice might - theoretically - not have been that silly. Even in 'energy conservation mode', most performance GPUs still draw a lot of power, so while including a secondary integrated card made the whole machine heavier and more expensive, it theoretically yielded the best of both worlds: a truly energy-efficient integrated graphics card for longer battery sessions, and a discrete graphics card for real performance. The reality was that the machine required custom DELL-made Nvidia drivers to function properly. Over the course of the machine's support life, guess how many drivers DELL issued?

Three.
When you consider that using SLI (which the machine supported) in any game requires specific driver support, you can quickly see how lacking this triple driver release schedule was. If my memory serves, the three drivers were released over the course of 12 months, meaning any game released after about a year would not make use of SLI with DELL's mandatory drivers. I ended up having to install the Nvidia reference drivers on an older machine just to extract the SLI profiles and inject them into the DELL drivers, making them SLI-capable with newer games. When Nvidia later changed the SLI profile format, I was also forced to manually modify the profiles prior to injecting them. The M17x also had a host of performance issues and would lock up somewhat sporadically (while using DELL's homegrown drivers, of course). Three hardware revisions were made, and while a few customers were lucky enough to receive free upgrades, DELL in Denmark seemed less inclined to help.

The moral of the story: if you buy a high-powered laptop, make sure the producer stays close enough to the Nvidia or AMD reference designs that you can use the native drivers, rather than relying on the hardware producer for driver updates.

Fast forward a few years, and I sit here with an aging Origin PC EON17-SLX whose performance has been waning. It sneaks up on you: games just don't seem to run quite as well anymore - and I really wish I had taken some performance measurements the very first day I got the machine, to have a comparative baseline. Alas, I did not, so I can only show the current state of things. Let's dive in.

Before you replace your thermal paste

You really ought to spend a little time figuring out whether or not thermal paste is the source of your performance problems. My research hasn't led me to a conclusive answer on how often it needs replacing: some people say "every two years", others say it should last "forever". Do yourself a favor and look into how your GPUs are performing before doing anything at all. I use HWiNFO myself, but any free sensor-monitoring tool will do. Set it up to monitor both the temperatures and the clock frequencies of your GPUs. Here are some of HWiNFO's readings during a recent play session of Shadow of Mordor.

The main GPU is in red (on top), and the secondary GPU is in gray (on the bottom). I've annotated a few things in the image above to make them clearer. The green lines indicate the GPU's maximum clock frequency (on boost), the brown lines show where the temperature hits 90 degrees Celsius (and the GPU clocks down), and the two purple lines approximately indicate where I entered/left the pause menu in the game. Notice how the main GPU sits at a near-constant 90 degrees, clocking down in frequency to avoid a meltdown. Only in the pause menu does the machine get to catch its breath. As I leave the pause menu (around the second purple line from the left), the main GPU clocks back up to its maximum frequency of 757.7 MHz for just a little bit, before it hits 90 degrees again and once more clocks down. This is a clear-cut case of GPU overheating. If you see something similar when taxing your machine, replacing your thermal paste might help.

Note: just because the clock frequency is at its maximum does not mean that your GPU(s) are performing at maximum capacity. Aside from running a graphics-heavy game, another useful tool is FurMark, which renders some very GPU-intensive graphics.
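If you'd rather log these readings than eyeball a graph, the same throttling check can be scripted. Below is a minimal sketch assuming an NVIDIA card with the `nvidia-smi` command-line tool on the PATH; the 90-degree threshold and 757 MHz boost clock come from my readings above, so substitute your own card's values:

```python
# Sketch: poll GPU temperature and clock via nvidia-smi and flag readings
# that look like thermal throttling (hot GPU running below its boost clock).
# Assumptions: an NVIDIA GPU, nvidia-smi available, and the threshold/boost
# values from the HWiNFO session discussed above.
import subprocess

BOOST_CLOCK_MHZ = 757   # maximum boost clock observed for this GPU
THROTTLE_TEMP_C = 90    # temperature at which the GPU clocks down

def parse_reading(line):
    """Parse one 'temperature.gpu, clocks.sm' CSV line, e.g. '90, 640'."""
    temp_str, clock_str = line.split(",")
    return int(temp_str.strip()), int(clock_str.strip())

def is_throttling(temp_c, clock_mhz):
    """A GPU at its thermal limit while below boost clock is likely throttling."""
    return temp_c >= THROTTLE_TEMP_C and clock_mhz < BOOST_CLOCK_MHZ

def monitor():
    """Log one reading per second, flagging throttled samples."""
    proc = subprocess.Popen(
        ["nvidia-smi",
         "--query-gpu=temperature.gpu,clocks.sm",
         "--format=csv,noheader,nounits",
         "-l", "1"],
        stdout=subprocess.PIPE, text=True)
    for line in proc.stdout:
        temp, clock = parse_reading(line)
        status = "THROTTLING" if is_throttling(temp, clock) else "ok"
        print(f"{temp} C, {clock} MHz -> {status}")

# Call monitor() during a gaming session to get a live once-per-second log.
```

Skimming the resulting log for 'THROTTLING' entries tells you the same thing as the annotated graph; HWiNFO remains handier for spotting patterns visually.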

Replacing the thermal paste

You can find a ton of step-by-step guides on how to do this online, so I'll stick to the specifics of the Origin EON17-SLX unit, along with a few things I learned along the way.

Above you can see the backside of the laptop along with some of the tools I used to do the job: Arctic thermal paste and the branded cleaning solution - I'm fairly sure you can find a much cheaper replacement for the latter.

Here's the opened laptop with the battery removed. In addition to unplugging your laptop before doing any maintenance, removing the battery is also a good idea. A few guides will suggest wearing anti-static wrist straps or other safeguards. While I'm sure a static charge can damage the electronics you're servicing, it's not something I personally take any precautions against. But I also don't seem to build up any static charge in my working environment - so if you do, make sure to discharge it prior to working on your laptop, e.g. by touching something made of metal.

Having removed most of the pre-existing thermal paste using the cleaning products and a significant number of coffee filters, here's what the CPU looks like with a bit of fresh thermal paste applied. A few things to note here. When cleaning either the CPU or GPU, coffee filters are often recommended as a replacement for a lint-free cloth. To my understanding, you can use almost any cloth for the first few passes, as long as you reserve the final passes for either a lint-free cloth or a coffee filter. Many sites will suggest a particular method for applying the thermal paste; I'd suggest having a look at one of the many YouTube videos covering how thermal paste tends to spread to get a good idea of which method to use. Some methods (like the dot method) are easier for beginners and tend to do the job well.

Here's a good shot of what your GPU should look like once you've completed the cleaning process. Nice and smooth. My phone camera had a lot of trouble focusing on the chip itself, as it tended to reflect all the light that shone on its surface.

Here's a small amount of thermal paste applied to the GPU. It might be a bit more than optimal, but after having completed this process about three times, I'd say it's good enough. To the best of my knowledge, the worst thing that can happen if you apply too little or too much is that the unit fails to cool properly, resulting in an emergency shut-down. While this isn't exactly ideal, it does not cause any permanent damage to the GPU/CPU as far as I know.

Having replaced the thermal paste approximately three times to ensure I got it right, I'd love to say performance is much improved and things are better than ever. Except they're not. I've definitely noticed an improvement, but not as much as I had hoped. The machine still overheats, and I'm beginning to think it's just hardware fatigue setting in. The only thing left would seem to be the fans, and while they aren't blaring away, they certainly aren't silent either. The vast difference I'm measuring between the master and slave GPU would seem to indicate that the master GPU simply fails to cool properly anymore. Perhaps for internal reasons?

I've considered switching the GPUs to see whether the same situation presents itself - whether it really is just an issue with one of the GPUs, or perhaps the attached heatsink - but given that the hardware is nearly 3 years old by now and likely to be replaced soon, it hardly seems worth the effort. That being said, I may very well opt for a non-SLI laptop in the future. The extra weight and heat do come at a cost, it would seem.

A while back I wondered about best practices regarding database naming conventions. In my search I happened to come across an article by Robert Pittenger. It made enough sense to me that I've adopted a number of the naming conventions he introduces. Having recently rebuilt this website using Laravel, I - like a few others - ran into some issues trying to change the standard field/column names in the 'users' table. So let's look at what needs to be done.

Note: I assume that the default authentication driver you use is eloquent and not database, as the latter requires a few different changes.

For the sake of full disclosure, a few vocal supporters would argue that this is a silly endeavor and that venturing forth with these changes goes against the whole purpose of using a pre-existing framework. Personally, I do not agree with this argument: when it comes to something like database names, the framework should absolutely yield and allow this to be as customizable as possible. Being able to change this makes the framework better. Finally, this is of course only necessary if you - like me - are lazy and would like to make use of Laravel's integrated authentication code. If you're building that from scratch, you really needn't bother with any of this. Anyway - I digress; we'd like to change the following 5 things:

  • The table name from 'users' to 'tblUser'
  • The 'id' field to 'useID'
  • The 'username' field to 'useName'
  • The 'email' field to 'useEmail'
  • The 'password' field to 'usePassword'

You may have noticed I don't bother changing the timestamp fields. That's because my OCD only takes me so far.

Changing the table name is a cinch. Simply open the provided /app/User.php model file and change the associated table name thusly:

	//protected $table = 'users';
	protected $table = 'tblUser';

Changing the 'id' field is also fairly straightforward. We simply need to override the $primaryKey member that the class inherits, as such:

    protected $primaryKey = 'useID';


Changing the 'username' field doesn't require any re-coding (except for one part I note at the end), so just rename that field.

Now things get a little trickier, as both the 'email' and 'password' fields are a little more deeply integrated into Laravel's existing authentication code. But not so much that a few quick overrides can't help us out. Looking at the provided login view (located at /resources/views/auth/login.blade.php), we obviously need to change the name of the input field for 'email' to 'useEmail'. Naturally, one would also want to change the 'password' field to 'usePassword', but hold off on that for now.

If we use the page, we'll run into some validation errors, as the existing back-end code still expects a filled field named 'email'. If you open up the /app/Http/Controllers/Auth/AuthController.php file, nothing immediately stands out as problematic, but that's because all the heavy lifting is actually handled by the 'AuthenticatesAndRegistersUsers' class. The 'postLogin' function is the problem, but since editing vendor files will cause all kinds of headaches (when you eventually update), make a copy of it and place it in the 'AuthController' class, thus overriding the inherited functionality. With a few changes, we'll now be referencing the altered field name:

    // Override to use 'useEmail' instead of 'email'
    public function postLogin(Request $request)
    {
        $this->validate($request, [
            'useEmail' => 'required|email', 'password' => 'required',
        ]);

        $credentials = $request->only('useEmail', 'password');

        if ($this->auth->attempt($credentials, $request->has('remember')))
        {
            return redirect()->intended($this->redirectPath());
        }

        return redirect($this->loginPath())
            ->withInput($request->only('useEmail', 'remember'))
            ->withErrors([
                'useEmail' => $this->getFailedLoginMessage(),
            ]);
    }
Four down, one to go. You'll notice the 'password' field name has remained unchanged. That's because it's more deeply integrated than the 'email' field was, and there's no simple surface-level fix - but one small override takes care of everything (the input field in the view keeps its 'password' name). Back in the /app/User.php file, add the following override:

    // Override required, otherwise the existing authentication system will not match credentials
    public function getAuthPassword()
    {
        return $this->usePassword;
    }

This will now provide Laravel with the proper password credentials stemming from our renamed field.

Finally, make sure to also update the $fillable and $hidden arrays with the new field names and you should be good to go!
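For reference, the updated arrays might end up looking something like this - a sketch assuming the default Laravel 5 User model, where 'remember_token' is the framework's default column and isn't renamed in this post:

```php
    // Sketch: mass-assignment and serialization settings in /app/User.php,
    // using the renamed fields from this post.
    protected $fillable = ['useName', 'useEmail', 'usePassword'];

    // Keep the password (and the default remember_token) out of array/JSON output.
    protected $hidden = ['usePassword', 'remember_token'];
```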

Fields named however you want them, and Laravel's useful authentication remains intact!

© Lasse Laursen 2015 - 2021