My Path to Advisory Services

The journey from support to advisory

To me it all seemed obvious; surely no one could fail to understand the need for services that are focused on business management. The concept of having penetration points along all the key elements of the lifecycle has seemed obvious to me for years.

I thought I’d share some insight into my career to date so that the path to today is a bit clearer – I’ve tried to include some of the key highlights! There’s been far more excitement, but that’s for another day.

IT as a business because of games?

Continue reading

Making Millions appear

So you’re writing a business case, or trying to justify a crucial or innovative solution that will work wonders for your (or your customer’s) organisation. The only problem is that it’s going to require money to be spent before you get to this wonderful end state. Now I’m sure a number of people have used vendor TCO/ROI tools and been amazed at the numbers that come out after clicking next, next, finish.

Well, apart from the usual “default” values of 100% benefit on day 1 that they seem to provide, they also almost always rely on indirect benefit analysis. Indirect benefit analysis is a way of putting financial numbers on an intangible event. For example: Continue reading
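As an illustration of how such numbers get manufactured, here is a minimal sketch of an indirect benefit calculation; every input below is my own assumption, not a figure from any vendor tool:

# Illustrative indirect-benefit calculation (all inputs are assumptions,
# not figures from any vendor TCO/ROI tool).

users = 500                 # staff affected by the new solution
minutes_saved_per_day = 5   # assumed productivity gain per user
working_days = 220          # working days per year
hourly_rate = 30.0          # assumed fully loaded cost per hour (GBP)

# The "indirect benefit": intangible time savings converted to money.
hours_saved = users * minutes_saved_per_day * working_days / 60
annual_benefit = hours_saved * hourly_rate

print(f"Hours saved per year: {hours_saved:,.0f}")
print(f"Claimed annual benefit: £{annual_benefit:,.0f}")
# ~£275,000 a year "appears", yet no budget line ever shrinks,
# which is exactly why indirect benefits deserve scepticism.

Notice that the headline figure is extremely sensitive to the minutes-saved assumption, which is usually the one the wizard defaults for you.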

2014, the digital age? Haven’t we had that already?

So I read a lot of analyst blogs and CIO top priority lists/predictions etc.

Sometimes what I struggle with is the idea that we as humans progress at anywhere near the rate of technological change. To give an example, I was at MS TVP for a partner Server 2012 launch event where we were discussing private/hybrid/public cloud and the features, benefits and impacts these could have on our customers. I raised a question: “considering how advanced the technology already was, and how much greater it is these days, what do we do about the fact that organisational IT maturity seems to lag so far behind the technology curve?”

Well, there was some humming and pondering, a response of “well, that’s a good point!”, and then we carried on talking about technology features.

My point here is that if we look ahead to the 2014 predictions we will see “mobility”, “BYOD”, “security”, “digital”, “SDDC” and all manner of other lovely acronyms and buzzwords. What I’d like to see is some thought about how we get there.

Let’s take this PDF:

http://www.gartner.com/imagesrv/cio/pdf/cio_agenda_execsum2014.pdf

On page 3 you will see a diagram resembling a maturity model, in which Gartner says we are between “IT industrialization” and “digitalization”. I would argue that we are not quite there yet. True, I’m sure we could find isolated pockets within companies where that holds, but as a whole I’ve yet to find many customers I would place at the “we are here” line. From a high-level, industry-wide perspective I agree with the concept.

What I’m failing to see is how we drag ourselves (and yes, I believe it will require some heavy lifting) from today’s buzzwords to a “rationalized” level of maturity that is capable of moving into this “digitalization” phase.

 

VDI: when to use it!

I thought I’d jot down a few notes on the good use cases for VDI. I often hear people sell the idea with cost reduction as a key driver. It’s true you can use a VDI initiative to reduce the direct total cost of ownership, but in reality the VDI element is not what causes the reduction in TCO. (I am also assuming we are reducing TCO for an organisation at a BASIC maturity level.)

So without using reduced TCO as a driver, when would I recommend VDI?

Continue reading

Inside the mind of a solution – Selecting a general-purpose document management system

“I started to write this a while ago… but never got round to publishing….” – Dan

A shared folder will do, surely?

Back when I started in IT, a network share was KING of the hill when it came to sharing documents and spreadsheets. Moving on from FAT-based systems, we could start granting access to groups (hopefully the correct type, for those who have shared the pain when people haven’t used local groups) and applying granular permissions at both the share and file-system level.

Alas, this shared pool of file storage still had many limitations. Some can be mitigated with search and shadow copies, but it remains a large pool, generally separated only by two-dimensional folder structures (I’m ignoring access-based controls and metadata search, but again these are augmentations).

Continue reading

Transformation Services – Effective ITSM vs. “Passport Processes”

In an ideal world our ITSM discipline would be mature enough not to require a separate process. A passport document outlining the current service state would already exist in the Configuration Management System, all dependencies would be mapped, and we would understand the support and warranty levels for everything.

Project teams could go to the CMS; the Service Acceptance Criteria (SAC) process would be invoked, followed by RFC and release management, ending with a migrated service.

In reality, though, it is often suggested that a “Passport Process” be utilised to handle IT transformation activities.

Continue reading
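For illustration only, here is roughly what such a passport record might capture; the field names below are my own sketch, not a formal SAC or passport template:

# Hypothetical shape of a "service passport" record. Field names are
# my own illustration, not a formal Service Acceptance Criteria template.
service_passport = {
    "service_name": "Payroll",
    "owner": "HR Operations",
    "current_state": "Live",
    "dependencies": ["SQL cluster", "AD authentication", "File services"],
    "support_level": "24x7, third line via vendor",
    "warranty": {"availability": "99.5%", "capacity": "500 concurrent users"},
    "known_risks": ["Single point of failure on the interface server"],
}

# In a mature CMS this record already exists and is kept current;
# the "Passport Process" is the workaround for when it doesn't.
print(service_passport["dependencies"])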

How to back up SCSM 2012 Service Manager Server and Data Warehouse

For those of you who use native backup, or for those building the system, I’ve put together a quick T-SQL file for backing up the databases:

For the SCSM Service Manager server:

USE ServiceManager
GO
BACKUP DATABASE ServiceManager
TO DISK = 'c:\backup\ServiceManager.bak'
WITH FORMAT, MEDIANAME = 'CSQLSERVERBACKUP',
NAME = 'ServiceManager Full Backup';
GO

For the data warehouse use:

Continue reading
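The full script is behind the link above; as a sketch, the warehouse side follows the same pattern for each of the data warehouse databases (DWStagingAndConfig, DWRepository and DWDataMart on a default SCSM 2012 install; adjust the names and paths to suit your environment):

-- Sketch only: database names assume a default SCSM 2012 data
-- warehouse install; adjust names and backup paths to your environment.
BACKUP DATABASE DWStagingAndConfig
TO DISK = 'c:\backup\DWStagingAndConfig.bak'
WITH FORMAT, MEDIANAME = 'CSQLSERVERBACKUP',
NAME = 'DWStagingAndConfig Full Backup';
GO
BACKUP DATABASE DWRepository
TO DISK = 'c:\backup\DWRepository.bak'
WITH FORMAT, MEDIANAME = 'CSQLSERVERBACKUP',
NAME = 'DWRepository Full Backup';
GO
BACKUP DATABASE DWDataMart
TO DISK = 'c:\backup\DWDataMart.bak'
WITH FORMAT, MEDIANAME = 'CSQLSERVERBACKUP',
NAME = 'DWDataMart Full Backup';
GO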

Re-inventing the wheel

Recently I’ve spent some time looking at Microsoft’s System Center Service Manager 2012. In doing this I’ve been investigating not only the technology aspects but also the service management aspects. Implementing a service management tool these days is normally about replacing an existing system, but there are still a number of businesses who have no tools at all.

With this in mind I’m developing best practices, based on standard frameworks, to provide a summarised view of recommended approaches to implementing a new service management tool. While drawing on the information in my head is useful, I decided to go back to the books. A recent conversation made this statement stand out:
“a fool with a tool is still a fool”

I think it’s important for us not to underestimate the complexities of implementing or changing a service management tool. I have a tool (I know, I just made a point about tools) that I created based on ITIL’s PMF (Process Maturity Framework) to help assess clients’ ITSM maturity and give me greater insight into the environment (this tool requires knowledge to be effective).
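To give a flavour of the idea, here is a minimal sketch of a PMF-style scoring pass; the five maturity levels are ITIL’s, but the process list and scores below are purely illustrative:

# Minimal sketch of a PMF-style maturity assessment.
# ITIL's Process Maturity Framework defines five levels;
# the processes and scores below are illustrative only.
PMF_LEVELS = {1: "Initial", 2: "Repeatable", 3: "Defined",
              4: "Managed", 5: "Optimising"}

assessment = {
    "Incident Management": 3,
    "Problem Management": 2,
    "Change Management": 3,
    "Configuration Management": 1,
    "Service Level Management": 2,
}

overall = sum(assessment.values()) / len(assessment)
weakest = min(assessment, key=assessment.get)

print(f"Overall maturity: {overall:.1f} ({PMF_LEVELS[round(overall)]})")
print(f"Weakest process: {weakest} (level {assessment[weakest]})")
# A tool deployment that assumes level 3+ processes will struggle here:
# "a fool with a tool is still a fool".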

Both ITIL and TOGAF offer guidance on how to make change successful, and my advice is to follow the standard rules. However good a product (a component, in this case) may be, if you don’t have people, process and technology in alignment, then you’re heading down a path towards an upset user base, late nights troubleshooting and, most likely, rising call volumes.

So the fruits of my research so far:

  • Use existing reference architectures (ITSM tools have been deployed before).
  • Follow the guidance of ITIL; while it may not be specific to this product, the advice is sound, and having the correct policies, processes and procedures is key.
  • ITIL has tool selection guidance.
  • An iterative framework should be used (Deming-based, i.e. Plan-Do-Check-Act).
  • Understand the as-is, not just in terms of technology but also in terms of people and process.
  • Don’t rush; getting this right is more important than getting this “installed”.
  • Requirements, Communication and Training…you need all three

If you’re embarking on an ITSM tool implementation, greenfield or refresh, then good luck; if you follow the well-established guidance you should be moving to a good place.

(Here is a reference I came across (SDLC); I haven’t had time to read all the content yet, but it seems to have some good material in it – http://msdn.microsoft.com/en-us/library/bb756611.aspx)

BYOD… to be or not to be… maybe it’s not that simple!

I’m still in two minds about the business benefit of BYOD initiatives. I’ve just read an article which, while interesting (BYOD DRIVES COMMUNISM OUT OF IT), seems a little one-sided. In my mind it’s not about business units walking off and consuming IT from other providers; it’s about ensuring that the services IT provides are aligned with the business.

I’ve yet to see someone make a statement about BYOD where an actual purpose is drawn out. Sure, we can create an 802.1X access path and use intelligent access controls with health validation and individual VLANs per client (all very cool technical stuff), but if this is just to access systems that are already reachable in the workplace using corporate facilities that can be secured, governed and controlled, I don’t see the benefit to the business other than moving gadget purchases off the business’s budget.

I’m building a BYOD POC; maybe I will become enlightened… only time will tell.

Planning to deploy Windows 8? Where do I start?

Understand the as-is state

    Use the tools available: MAP, ACT, SCCM and other SAM/inventory tools…
    Enhancing the end-user experience: compile a list of (say 10) enhancements that the new deployment will deliver, e.g. automated software deployment (Group Policy), enhanced security (BitLocker), removal of user administrator rights (Group Policy), improved patch management (monthly restart feature), removing the password-reset burden from the service desk (self-service reset), providing self-service re-deployment capabilities… the list could go on and on.
    In my mind this list is important: it can be used as a measure of success and will also help focus on key areas of enhancement. Build the items from the business and IT discussions, compile them, then have IT and the business sign them off.

So how do I know if I was successful?
IT projects often run with no formal investigation to establish whether they succeeded… sure, that’s great for the times when things don’t go quite to plan, but it’s a bit of a head-in-the-sand approach. Once the project is complete and the dust has settled, send out a survey to your user base. Find out whether you were successful and capture the feedback as lessons learnt.

Automation, automation, automation… am I clear about this?

Ever since I was first tasked with building more than a handful of PCs (migration, refresh or new deployment), I have automated. While my colleagues would spend hours installing Windows manually or creating ad-hoc Ghost images, I was building an unattended install and a Sysprepped image… this soon became the standard mechanism and changed the way OS installation was viewed at the company. If you are deploying Windows 8 to more than a handful of devices, I would automate. If we take 4 hours per PC (which seems to be the accepted figure for client builds) at 50 PCs, we have 26.7 days of effort.
If we spend a week creating an image and halve the time it takes to deploy, we end up with 17.4 days, a saving of 9.3 days of effort. As you can see, even at small scale it makes sense to automate; a sketch of the arithmetic follows below.
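Here is the back-of-envelope arithmetic as a sketch, with the assumptions made explicit (the 7.5-hour working day and five-day image build are my own assumptions; the exact day totals depend on the day length and build time you choose, which is why they can differ slightly from the figures above):

# Back-of-envelope effort model for automating client builds.
# Assumptions: 4 h per manual build, 50 PCs, 7.5 h working day,
# automation halves per-PC deployment time at the cost of ~1 week
# (5 days) spent building the image. Figures are illustrative.

pcs = 50
manual_hours_per_pc = 4.0
day_length = 7.5          # hours in a working day (assumption)

manual_days = pcs * manual_hours_per_pc / day_length           # ~26.7 days

image_build_days = 5.0    # a week creating the image (assumption)
automated_days = pcs * (manual_hours_per_pc / 2) / day_length  # ~13.3 days
total_automated = image_build_days + automated_days            # ~18.3 days

print(f"Manual effort:    {manual_days:.1f} days")
print(f"Automated effort: {total_automated:.1f} days")
print(f"Saving:           {manual_days - total_automated:.1f} days")
# Even at this small scale automation pays back; at hundreds of PCs
# the one-off image-build overhead becomes negligible.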

I hope this starts to paint a picture of how to approach a Windows upgrade. There is always plenty of guidance from Microsoft that has been really useful to me, so if you get stuck the answers are never too far away.
