The personal ramblings of Jon Carlos, mobile & web developer and technology enthusiast.

Tuesday, 28 August 2012

Could not load file or assembly. The located assembly's manifest definition does not match the assembly reference.

13:53 Posted by Jon Carlos
Ran into an issue today where I had installed two packages from NuGet which both referenced Newtonsoft.Json, but at completely different versions.

I put a call out on Twitter and got some great suggestions from @joeriks and @ismailmayat, who suggested adding the dependentAssembly tag to my web.config to allow me to run multiple versions of the same DLL.

I worked on this for a couple of hours but could not get it working. The project is a nopcommerce project, and as it uses a plugin architecture the DLLs are not in the bin directory but in a plugins directory. I think one of the reasons I had so much trouble was that there was already a version of Newtonsoft.Json in the bin folder for the core project as well, so no matter what I did I could not reference the correct DLL.
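For reference, the dependentAssembly approach I was trying looks roughly like this in web.config (the version numbers here are illustrative, not the exact ones from my project, so adjust to whatever your packages actually reference):

```xml
<runtime>
  <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
    <dependentAssembly>
      <!-- Redirect every older Newtonsoft.Json reference to a single version -->
      <assemblyIdentity name="Newtonsoft.Json" publicKeyToken="30ad4fe6b2a6aeed" culture="neutral" />
      <bindingRedirect oldVersion="0.0.0.0-4.5.0.0" newVersion="4.5.0.0" />
    </dependentAssembly>
  </assemblyBinding>
</runtime>
```

In a normal project this works well; as described above, it didn't help me because the plugin DLLs lived outside bin.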

This led me to Google a bit further and find ILMerge. I remembered Matt Brailsford talking about this a while ago, so I checked out his blog, but he was using build events and I did not need this to run multiple times, just once. So I Googled a bit further, downloaded ILMerge and started to bundle the packages I had downloaded from NuGet.

Once ILMerge is installed, open up the command prompt and navigate to your install directory. Once there, all you need to do is run the following command and it'll bundle all your DLLs into one.

ilmerge /out:"createsend-dotnet.merged.dll" "createsend-dotnet.dll" "Newtonsoft.Json.dll" /targetplatform:v4,"C:\Windows\Microsoft.NET\Framework\v4.0.30319"

That example is for Campaign Monitor's API package. Remember, you may need to add the full directory locations of your DLLs; I've stripped them here to make it easier to read. The basic breakdown is like this:

/out:"MERGED.dll" - The name of the single merged DLL that will be produced

"PRIMARY.dll" - The first DLL in the list; the merged assembly takes its identity from this one

"SECONDARY.dll" - You can have as many of these as you like, you just add more to the list

/targetplatform - The target .NET version and the path to its framework directory
Hope this helps for future reference :)

Monday, 27 August 2012

Visual Studio touch file Post-build event

12:29 Posted by Jon Carlos
Quick post, as I've been working on a nopcommerce site recently and found that when writing plugins you have problems with the plugin cache not updating when you build your new plugin.

It seems there is an issue where, when you build, MVC does not pick up the newly created DLLs until you restart the App Pool.

I found a post that said that you needed to touch the global.asax file to make this happen without recycling the App Pool.

I found this to be the case, but got very quickly bored of opening and saving the global.asax file, so I looked for an automated way to do it.

In NopCommerce, add the following to the post-build event of the Nop.Web project and it'll automatically recycle the App Pool when you need it to. (The +,, arguments make copy rewrite global.asax in place, which updates its timestamp, much like a Unix touch.)

copy /b "$(ProjectDir)global.asax" +,, "$(ProjectDir)"

Wednesday, 4 April 2012

Automated Amazon AWS S3 backup using JetS3t on Mac OS X

13:08 Posted by Jon Carlos
I've been doing a review of my backup setup recently and figured that keeping backups on disks in my house was a little lacking in security and redundancy.

Wind, rain, fire & snow could all cause major issues, and the loss of memories in photo and video form would be unacceptable.

So I started to research backup solutions, and as I know S3 through my work on a couple of projects I figured this would make sense. Now, there are a few options for backing up to S3. I have to admit I've not tested them all, but I did find one that sits right with my knowledge and setup.

The options:
Arq - I tested Arq a while ago and it worked great; however, I did not feel the software responded too well to trying to back up my iPhoto library, so I uninstalled the trial.
CrashPlan - I did not test this but it was one of the options I considered.
JetS3t - Command line AWS S3 interface & a Java Browser based uploader

So as you may have guessed by the title of the post I chose JetS3t.

It took me a little time to work out what I'd downloaded which is why I'm writing this post as I hope it'll help others.

So when you extract the zip file you'll see a couple of text files (README, LICENCE, Build etc.). You'll also see some folders; the two folders I'm going to concentrate on are bin & configs.

bin - This is where all the Bash and Bat scripts are; we'll be using these but also writing our own.
configs - This, funnily enough, is where the config files are held. These include the file where the AWS Key and Secret Key are kept. I also had a problem with threads sometimes failing, and I had to tweak a variable to stop this from completely cancelling the upload.


Extract the Zip

So first off, once you've downloaded the zip you'll need to extract it. You may have your own preference, but I extracted the files to /Users/YOURUSERFOLDER/jets3t/ as keeping it anywhere else did not make much sense.


Edit the Config

So you now need to add your AWS credentials to the config file. Initially I did this and it was not working; it turns out the two fields are commented out to start with, so remember to remove the # at the start of each line.

# Service Access Key (if commented-out, Synchronize will ask at the prompt)

# Service Secret Key (if commented-out, Synchronize will ask at the prompt)
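Once uncommented, the two lines end up looking something like this (the property names are my recollection of the JetS3t properties file, so double-check them against your copy; the values are of course your own keys):

```
accesskey=YOUR_AWS_ACCESS_KEY
secretkey=YOUR_AWS_SECRET_KEY
```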


Using JetS3t

So once your configs are ready, open up the Terminal (cmd + Space, or Applications/Utilities/) and cd to the folder you extracted jets3t to. In my case that's /Users/YOURUSERFOLDER/jets3t/bin/.

Now test the setup by running the following command in the terminal: UP BackupBucket /Users/YOURUSERFOLDER/Pictures

Change the name of the bucket to the one you want to upload to, and the path to the one you want to back up.

DISCLAIMER: This test should be run on an empty bucket; if you run this on a bucket with files in it, they will be deleted. I learned this the hard way, don't make the same mistake! I'll come on to the fix for this later.

So you should see a list of the files in the folder, and once that's complete the upload should start. JetS3t provides detailed updates on the file currently uploading and the estimated time left.


Backing up multiple folders

So now you have a backup command working, we'll set up a custom bash script to back up multiple folders and stop that annoying delete-once-uploaded behaviour.

So create a new text file in the bin folder (I came up with an imaginative name) and paste the following code into the file. Remember to change the bucket name and folder name as you did before.

#!/bin/bash
UP YourBucketName TheFolderYouWantToBackup -k >> logs/amazon-s3-FolderName-backup.log

So to break this down a little:

-k - This stops the synchronize command from deleting files once they have been uploaded, and means files that no longer exist on your local machine will not be removed from the bucket.
>> logs/amazon-s3-FolderName-backup.log - This pushes all the logging information into the log file. You should review this from time to time, or if you have problems with files backing up.

To run the new script you just created, type its name in Terminal. This will run all the backups you have added to the file. Please note, I can't remember if I had to do this, but you may need to set the file to be executable:

chmod 0755
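To give a fuller picture, here's how I'd structure the script for several folders. This is my own sketch, not the script from the post: SYNC defaults to echo as a dry run, and you'd point it at JetS3t's synchronize script in the bin folder to actually upload.

```shell
#!/bin/bash
# Multi-folder backup sketch. SYNC defaults to "echo" so nothing is
# uploaded; set SYNC to the path of JetS3t's synchronize script for real runs.
SYNC="${SYNC:-echo}"
BUCKET="YourBucketName"

mkdir -p logs
for folder in "$HOME/Pictures" "$HOME/Documents"; do
  name=$(basename "$folder")
  # -k keeps files in the bucket even if they disappear locally
  "$SYNC" UP "$BUCKET" "$folder" -k >> "logs/amazon-s3-${name}-backup.log"
done
```

Each folder gets its own log file, so a failed upload in one folder is easy to track down.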



So now you have your backup running, you'll want to automate it so it happens on a schedule. You'll need to edit your cron file for this.

I'd not done this before, and when I started I had no idea how to work with Vim, but with a little Googling I worked it out.

In terminal type: crontab -e

Vim should open up. Now press i to get into edit mode and add the following line:

0 0 * * * /Users/YOURUSERFOLDER/jets3t/bin/

Once you've added the line, press Esc, type :wq and press enter; that will write the file and quit Vim. You should now see "crontab: installing new crontab", and you're done.
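For reference, the five fields at the start of a crontab line break down like this (the annotations are mine):

```
# minute  hour  day-of-month  month  day-of-week  command
#   0      0         *          *         *       -> runs at 00:00 every day
```

So "0 0 * * *" fires once a day at midnight; change the first two fields to shift the time.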



You've set up AWS S3 backup with JetS3t, configured multiple-folder backup with your own bash script, and added a cron job so it backs up at midnight every day.



Sunday, 29 January 2012

MailSettings with no SMTP server

19:56 Posted by Jon Carlos
I was doing some Umbraco v5 testing this weekend while Lee and Sabastiaan were working on a simple contact form. The contact form required MailSettings to be set up on the machine I was working on.

I don't run an SMTP server or virtual SMTP server, and could not really be bothered to as I just wanted to get everything up and running. So I did a bit of Googling and found that you can deposit emails to a folder on your computer for sending later.

So I got emails set up in about 30 seconds by using this in my web.config:

<smtp deliveryMethod="SpecifiedPickupDirectory" from="">
<network host="ignored" />
<specifiedPickupDirectory pickupDirectoryLocation="C:\email" />
</smtp>
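For context, that snippet lives under the standard system.net/mailSettings section of web.config (placement per the usual .NET config schema; note the attribute names are case-sensitive):

```xml
<system.net>
  <mailSettings>
    <smtp deliveryMethod="SpecifiedPickupDirectory" from="">
      <!-- Mails are written as .eml files to this folder instead of being sent -->
      <specifiedPickupDirectory pickupDirectoryLocation="C:\email" />
    </smtp>
  </mailSettings>
</system.net>
```

You can open the dropped .eml files with any mail client to check what your code would have sent.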