
Thursday, 15 November 2012

Software Bucket List–Nov 2012

Rob Maher’s blog post about the concept of a software bucket list got me thinking.

Let’s take a step back for a minute; what’s a bucket list? A bucket list is a list of all the things you want to do before you pass away. So a software bucket list for a developer is a list of the software you’d ideally like to write before you retire. As Rob says, it can be anything; it’s not about “learn this” or “learn that”, it’s about what you would like to build.

His post got me thinking. The whole idea of the list is that it will evolve over time, but as of November 2012 this is mine:

  • An integrated disc golf membership system with result submission, automated renewals etc.
  • Work for, and write software for, a Formula 1 team
  • A multiplayer re-write (modernisation) of an old strategy game we used to play in the early 00’s – Europa
  • Build <insert description here> to enable working from home for my own company – the idea hasn’t been thought of yet.

The last one doesn’t quite fit but it’s more a target to keep me inspired about moving forward.

Anyway, this is just a start, and now the seed is planted I’m sure I’ll come up with more. But for now, what’s yours?

Saturday, 9 January 2010

Cheap remote backup solution

After reading about the madness of what happened to Phil Haack a few weeks ago, when a physical hard drive failure took out the machine his VPS ran on, it seemed everyone on that box realised they didn’t have much, if any, backup solution in place. This got me thinking …

There are a number of remote backup solutions for remote servers, but they cost money, which at the current time I don’t have, and most people don’t either given the current economic climate. Checking my email the other day, I scrolled to the bottom of the page and noticed that I was only using a small amount of my Gmail account’s storage. I remembered there was a plug-in you could get to use your Gmail account as remote storage / a mapped network drive. This could be the reliable backup solution I was looking for.

After doing a bit of Googling to see if there was anything out there before I sat down to write my own solution, I found backup2Gmail on Code Project. Once I realised that you could run it from the command line, I was on to a winner.

So the backup bat file goes along the lines of:

1. Back up the SVN repository using svnadmin to a backup folder (c:\backup\svn).

2. Check out the latest code from the repository to a folder in the backup folder (c:\backup\code).

3. Run the backup2Gmail command line to back up the c:\backup folder. It zips it up and sends it to my account.

And that’s it. Put all that into a bat file, add some error checking to make sure the required directories exist at the beginning and are removed at the end etc., and set it as a scheduled task for some crazy time early in the day (currently set at 3am).
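For illustration, the steps above might sketch out as a bat file something like the following. The svnadmin and svn commands are real, but the repository path and the backup2Gmail switches are my assumptions; check the Code Project page for the tool’s actual arguments:

```bat
@echo off
rem -- make sure the working dirs exist (repo path is assumed)
if not exist c:\backup\svn mkdir c:\backup\svn
if not exist c:\backup\code mkdir c:\backup\code

rem -- 1. dump the SVN repository into the backup folder
svnadmin dump c:\repos\myrepo > c:\backup\svn\myrepo.dump

rem -- 2. check out the latest code alongside it
svn checkout file:///c:/repos/myrepo c:\backup\code

rem -- 3. zip c:\backup and mail it to Gmail (hypothetical switches)
backup2gmail.exe -src c:\backup -user me@gmail.com

rem -- tidy up afterwards, as described above
rmdir /s /q c:\backup\code
```

Scheduled via Task Scheduler for the early hours, as mentioned, so it doesn’t interfere with normal use.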

Hope this helps someone else avoid the heartache of server failure in the future.

Tuesday, 1 December 2009

WCF and serializing custom objects

I’ve been setting up a WCF service for a new n-tier system which I am currently architecting and developing at work. I’m trying to get to the point where all tiers store the data in the same objects (Entities) and they are worked on at different levels. These are simple POCO Entity objects which only store the data values and define the data annotations to be used for validation. This requires the WCF service to be able to serialize the custom objects and transmit them between the tiers, and in future, who knows, maybe to a Silverlight client application as well?!

So, with the objects in place and the correct DataContract and DataMember attributes applied, I got the following error message when trying to pass them through the service between tiers:

“The underlying connection was closed: The connection was closed unexpectedly.”

After doing some Googling I came across the following blog post, which had some handy pointers … thanks Bishoy Labib. But the biggest help was a post by Damir Dobric explaining how the "KnownTypeAttribute" is used when sending data over a WCF service.

So after decorating the interface with each Entity type I had (3 so far), I built the project, updated the service reference to pick up the latest definition and ran it with fingers crossed … it worked!

Damir originally declared each type which might be used through the WCF service interface individually, then refactored it by adding them manually through a KnownTypeContainer (similar to below). That wouldn’t quite work for me, as there are probably going to be lots of entities and I don’t want to have to add each one individually. As all the entities are defined in a single project, I thought that with a little reflection on the assembly I could load them in dynamically, so I came up with this …

using System;
using System.Collections.Generic;
using System.Reflection;
using System.ServiceModel;

[ServiceKnownType("GetAllMyKnownTypes", typeof(KnownTypeContainer))]
[ServiceContract(Namespace = "http://mynamespace/2009/IApplication")]
public interface IApplication
{
    [OperationContract]
    string Echo(string value);

    [OperationContract]
    Entities.EntityBase Execute(
        string action,
        Dictionary<string, object> parameters);
}

static class KnownTypeContainer
{
    // Called by WCF when the contract is built; returns every type
    // defined in the Entities assembly as a known type.
    public static IEnumerable<Type> GetAllMyKnownTypes(
        ICustomAttributeProvider provider)
    {
        return new List<Type>(Assembly.Load("Entities").GetTypes());
    }
}




Nothing special or clever; you just need to make sure that all the Entities derive from the common abstract EntityBase class for it to work.
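One possible refinement (my own sketch, not from the original posts; it assumes the same Entities assembly and EntityBase base class as above, and needs System.Linq) would be to filter the reflected types so that only concrete entities end up in the contract:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Reflection;

static class KnownTypeContainer
{
    // Variation on the container above: only hand WCF the concrete
    // types that actually derive from Entities.EntityBase, so any
    // stray helper classes in the assembly don't join the contract.
    public static IEnumerable<Type> GetAllMyKnownTypes(
        ICustomAttributeProvider provider)
    {
        return Assembly.Load("Entities")
                       .GetTypes()
                       .Where(t => !t.IsAbstract
                                && typeof(Entities.EntityBase).IsAssignableFrom(t))
                       .ToList();
    }
}
```

The filter is cheap because it only runs once, when WCF first builds the service description.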



Thanks go to Damir for the original post and for getting me through the issue I was having :-)

Wednesday, 18 November 2009

Asp.Net MVC 2 Beta released

After checking my morning blog feeds, the first one of interest was Phil Haack’s post about the release of Asp.Net MVC 2 Beta for VS 2008. I was going to put off downloading and updating until later, but after the teaser about nuclear facilities in the EULA I had to investigate …

This is the full use-rights clause he was referring to:

a. Because the software is a pre-release version, and may not work correctly, you may not use it, alone and/or in conjunction with other programs in hazardous environments requiring fail-safe controls, including without limitation, the design, construction, maintenance or operation of nuclear facilities, aircraft navigation or communication systems, air traffic control, and life support or weapons systems.

I should read EULAs more often; Genius!

Tuesday, 10 November 2009

Windows 7 Aero snap shortcuts

I was browsing through my regular blog feeds and the overnight tweets this morning when I came across the following link. It’s posted in the context of Visual Studio 2010 Beta 2; however, it works for any window which is currently selected.

As I use a dual-screen setup at work, and manually dragging/snapping windows to the side of the screen only works at the far edges of the entire desktop, these shortcuts let me snap a window to half of either screen spot on.

The shortcuts which are rapidly climbing my most-used list are:

Dock to Screen Left : Windows + Left Arrow
Dock to Screen Right: Windows + Right Arrow

I’m lovin’ Windows 7!

Monday, 9 November 2009

Have people not heard how to use namespaces?!?!

I’m trying to work out how something works, and it’s spread over multiple DLLs. This is fine in the grand scheme of things. They aren’t large DLLs either, and their names gave me hope that it would be well designed: one has the data access code, one the common code and one the business rules. Three projects working together isn’t large at all. I’ve worked with Visual Studio solutions with almost 100 projects in, so how hard could this be … ?!

So far so good …

That was until I started looking at the names of the class definitions. This is the point when I wanted to bang my head on the desk!

The data access DLL has a data access namespace, yet all the class definitions are prefixed with ‘dal’.

Why?!

The business rules DLL has a business rules namespace, yet all the class definitions are prefixed with ‘br’.

No seriously, why?

And to top it all off, the data entities are in the 3rd DLL, all suffixed with ‘Data’ … this in itself isn’t bad, except the namespace is ‘Data’, which kinda makes the suffix redundant.

And don’t get me started on the name of the DLL … and no, it doesn’t even have the word ‘data’ in it (see the opening paragraph ;-))

Why would you do that?!?!

*hithead*

I’m on Twitter!!

I’ve signed up to the revolution; you can follow me @WestDiscGolf.

What are you up to today? :-)

Wednesday, 4 November 2009

Visual Studio 2010 Beta 2 initial thoughts

Just thought I’d post a brief entry about my initial thoughts on Visual Studio 2010 Beta 2. So far so good: I like it, the layout is nice, the tooling improvements are good, and the response speed of the IDE has improved quite a lot … however, I won’t be using it full time until there is a version of ReSharper which works with it; I’m lost without the R# power!

:-)

Wednesday, 21 October 2009

Windows 7 used in anger

I’ve been using the Windows 7 RC on my Mac Mini for a couple of weeks now. I’ve got all the usual required applications installed (Firefox, Visual Studio etc.) and have been relatively impressed by the experience.

On starting my new job last week I was given a pretty powerful Dell Vostro 1720. It hasn’t got the highest build quality, but it is packed full of goodies, including a 2.6GHz dual-core CPU, 8GB of RAM and a 512MB dedicated graphics card driving a very crisp 17-inch screen at 1920 x 1200.

Anyway, I digress … at work I’ve got an MSDN subscription which allowed me to download the released version of Windows 7 64-bit and install it. I’ve now been using it for close on a week and it feels like forever. The user experience is good and it feels very natural to move to from Windows XP. I had used Vista a couple of times before and just hated it with a passion; it was slow and clunky.

I know the user experience, including the speed, is down to the components in the machine, but the RC runs pretty comfortably on my Mac Mini at home, and that doesn’t have much power or RAM. I’ve also seen people running it on netbooks, something you wouldn’t even think about doing with Vista. Being pretty impressed so far with the OS, I thought I’d put together a top n things I like about Windows 7. It will also give me the opportunity to look back in a few months’ time and see if I’m still impressed by it, and whether it’s still the same things which are good.

Top 4 “likes” about Windows 7

1. Speed  - The general speed of it is good. Quick recovery from hibernation, good reboot speed, general usage gives good response time.

2. Taskbar organisation – I love the way the task bar is organised. The pinning of applications to it, keeping all the icons together, being able to drag and drop them into which ever order you like is brilliant.

3. Multiple Monitor support – Brings a whole new meaning to ‘plug ‘n’ play’. Plug in your second monitor for the first time, configure it correctly (location, resolution etc.) and that’s it; done! Unplug it, take your laptop home, come back in, plug the monitor back in and it automatically recognises it and goes back to the previous setup. No pushing laptop function buttons and waiting for the screen to refresh!

4. Window docking – There are little applications out there to automatically set the active window to half the screen, move it left and right etc., but the built-in window docking in Windows 7 does it for me. The high screen real estate which I’ve now got makes putting applications half screen usable … just drag them over to the side of the screen you want and *poof*, they automatically take half the screen. My only niggle is that with 2 monitors it acts like a single screen, so you can’t dock left and right on both screens, which is a bit of a shortfall, but other than that all good!

Thursday, 4 December 2008

Visual Studio Solution Structure

I've been playing about with Visual Studio 2008 and the new features of C# 3 on and off for a couple of weeks, and lining up to get a couple of small projects up and running. One of them is my own website; one is a survey for next season for TDs and participants in national tour events.

I wanted to get the solution construction right for my website solution so that I can add a lot of stuff in over time and not have to re-work it to fit more things in. After setting up a subversion source control server for personal usage I decided that this was the time to mess about and I can always roll back bits if something goes wrong.

I wanted to keep the structure separated and also unit testable (yes, I'm a unit test geek). This led to splitting the solution into separate projects for different areas ... Data access, Model classes, Services and UI (using Solution folders works wonders for this ... more later).

*Top tip* To get your namespaces as you'd like them, start by naming the projects as you'd like the namespaces to be. A good starting point is <Company>.<Project>, as all the files in the project will start with that namespace as standard unless you change them.

The data access project is self-explanatory. I've decided to start using/playing with the Microsoft Entity Framework for this. I'm still learning how the EF works and what best practice is, and I'm sure I'm still not using it properly, but ah well; we live and learn! I've been reading about the Linq2Sql vs EF madness which has been going on in the month since PDC, but I don't think it'll affect a lone "out of hours" home coder.

The model project is where all the domain model classes go. I'm still trying to work out how this fits with the EF in the data project, so if anyone can point out how these should operate together *properly* then some pointers would be much appreciated.

The service project holds a WCF service. I've put this in place so that in the future I can access the data asynchronously via javascript, for an ajax look and feel with a responsive UI, and to avoid unnecessary postbacks.

UI is made up of a couple project types. The main one is a MS MVC project template which will serve as the main "website" in a traditional sense. It is made up of pages or views which will display different images / inputs / data and will also serve as host pages for the second UI project type; a silverlight 2 project. I've added this in as I want to start playing about with silverlight, and the eventual plan is to do my own site in both postback/views way, responsive UI javascript/async calls way and to have a RIA with silverlight.

To group the solution into logical areas I have used Solution folders to keep each project and its associated test project together. Solution folders are not related to the file structure; they are purely there for organisational purposes. They also have more functions, which can be found by reading the above link.

In addition to the grouped projects I have also added separate class library projects for extension methods and global definitions. These projects will be referenced by one or more projects and will not reference any other project (as circular references are bad!).

Another tip, if you are going to go down the unit test route, would be to add a folder in a logical place in the solution's file structure and put your mocking / helper dlls in it. You can then add a single reference from each unit test project to one set of dlls. If you are putting your code into a source control system, add the dlls to your solution and file them away in an "External" solution folder so that you can hide them away but still have them "under control".

Well, this is the structure I am going to work with for a while to see how it gets on. If you have any thoughts on the matter, or pointers to improve it, then please let me know.

This is the first post I've done of a technical nature so any comments about that would also be appreciated :-)

Friday, 7 November 2008

Server / computer naming conventions

I've worked in a few places which have cool (in a geeky way) naming schemes for the computers and servers on their corporate network, and I didn't quite realise the extent people go to when doing this until reading a question on StackOverflow: Coolest Server Names.

At my first IT job the servers and computers were named after characters from The Muppets. Our main server was called Gonzo, and when the mail server got its own box it was Piggy (after Miss Piggy). Our workstations were the usual Kermit, Fozzie and Beaker. After I had left and they had moved office they started again with Simpsons characters. At my current job we used to use only elements from the periodic table. I started with Palladium and then Iodine. The dev servers are Technetium, Tellurium and Tantalum.

Why am I posting this? Well, of the answers on StackOverflow I thought the top-rated one was great, so I thought I'd post it here ...
The funniest server name story I have is from when I worked at the Kennedy Space Center. On our particular project, our main server was named snowwhite, and the 7 client workstations were named after the Seven Dwarves. The kicker is, one day one of our engineers ran into a Disney Imagineer who worked at Walt Disney World, and they started talking about server names. The Disney Imagineer said "that's funny, we have a group of servers named columbia, challenger, atlantis, and discovery."

Thursday, 6 November 2008

It's taken over 3 years, but I'm 1337!

Time for some geekiness, but today I became 1337 on the BDGA forum!!

Wednesday, 10 September 2008

Rebuilding a mac for the first time

After having a little Mac Mini for over 18 months, this evening it's come down to being rebuilt for the first time!

Before getting my Mini I'd not used a Mac since year 10 work experience, when I went to a graphic design company; back then all the good graphics and design software ran on Mac OS. I think it was version 6, so very much in the PowerPC era of computers. But in the past couple of years the PowerPC stranglehold on the Mac has relaxed and Intel has broken in.

The main reason I got a Mac Mini in the first place wasn't the new Mac OSX operating system, and it wasn't that I was particularly looking for stability (although it's nice) ... the reason I got it was that it was incredibly small, very very quiet and still pretty powerful for what I needed.

I'm not into gaming that much, so I never needed a beast of a machine with awesome graphics to play the latest games. All I wanted was a machine I could program on, watch DVDs, download stuff and surf the net on. Not much to ask, I thought. The Intel dual-core chip inside also opened up the possibility of running Windows on the machine as well.

I played around with Boot Camp when it came out and installed Vista. Vista's performance rating gave it a 3.4 (I think), which was OK; but looking at the breakdown it was only that low because of the integrated graphics. The dual-core 1.8GHz CPU scored 4.5 - 4.7, the 2GB of RAM scored highly ... even the speed and performance of the hdd was good. So I was happy!

Anyway, once this is done I'll have the latest build of Boot Camp to play with, and Windows will be going back on. The reason ... purely for remote access from work, so I can make sure my downloads are good and I can access Visual Studio 2008 (as we've not upgraded yet ... still waiting!).

But so far so good the install has gone well :-)

Wednesday, 9 July 2008

Knight Rider Sat Nav

It's finally been confirmed that the Knight Rider style sat nav is being released ... although only in the US :-(

Details (well, the limited info that's been released) can be found on the Register, with more to follow hopefully. This does mean, however, that everyone's childhood fantasy of having KITT as their own is coming closer to reality.