New rear shocks.

I was digging around at the back of the car, and found some moisture on both the rear shocks – not something to worry about if you’ve been driving the car on wet roads, which I hadn’t.

Yeah, both rear shocks were failing and had begun to leak – time to replace them or risk an inevitable MOT failure.

Thankfully Olivia is a Quattro, which actually makes doing this work a little easier IMHO.

Coffee shop relaxation

This is the result of what I was going to blog when I was relaxing and having a coffee and sarnie, by way of a very late lunch after a morning TKD session (that incidentally was a sweat box, but very much fun 😀 ).

Well, I thought I’d attempt the classic techie shopping and lunch trip – and by the by, have a nice coffee in a random coffee shop and do some surfing.

So, after getting one or two little items – a little less than I had thought, or wanted, as one store didn’t have what I was looking for – I thought I would sit down in MK’s Starbucks or Costa, and enjoy some gentle caffeine-enhanced relaxation.

So – first off, I wanted Internet: you know, catch up on some of the news I’ve missed, maybe even write a blog post about something. Simple enough, I guess. My MacBook connected to “_The_Cloud” – but then I couldn’t surf without signing up, or remembering what I signed up with on my iPhone what seems like a billion years ago, and resetting my password would mean I needed access to email – which requires Internet access…. which.. yeah, ok, see the problem? (Astute readers will ask why I didn’t read the email off my phone – well……)

OK then, let’s try my iPhone itself, and tether.

Four attempts at connecting. Ok, I can’t whinge too loudly at that – I hadn’t connected this MacBook to my iPhone before (I did with my old MacBook).

A tiny glimpse of Internet via my iPhone, so I kick off an update of a plugin on a website of mine, but…… then nothing.

So that is Problem 2 – Vodafone: whilst saying I have 4 bars and a 3G signal (this being an iPhone 4S – no 4G), I have now lost the Internet. (So, back to the problem of getting at email….)

OK – try again, with my Samsung Galaxy SIII mini, and although this connected first time (I had used that phone with this MacBook whilst on holiday last week), again, despite the service provider showing a decent signal – I’ve got no Internet.

And the biggest pain in the arse of all – I didn’t charge any of my devices last night, so whilst I am now currently writing this on a MacBook Pro in Pages instead of online on my blog, I’m doing it with only 11% of battery 🙁

OK – it reckons 0:39 hours left, which should mean I make it to the end of my steak and cheese sarnie and coffee, but, pretty much bugger all else.

So I’ve turned the backlit keyboard off, and dropped the screen brightness down to a point where it’s dark enough to almost think the text is dark grey on a slightly lighter dark grey – and the battery life has now dropped to 0:35 remaining.

And the update of a plugin on my site still hasn’t finished – the connection I have, tenuous at best, wasn’t fast enough to finish updating the page before I was careless, and lost the Internet.

So what I have done with this blog entry is write it in Pages, so that I can get my thoughts down, and I will upload it when I get home to something approaching a broadband connection.

Question is, will I get to the end of writing the blog entry before the battery on the MacBook dies?

Chances are actually pretty good, it’s a decent bit of kit to be fair, but technology, batteries especially, don’t last forever.

Now my iPhone is at 15%, my GSIII is at 21%, and this MacBook is at 7%; I’ve finished my coffee, and I’m more than ready to go home.

So, technology makes things easier ?

Hmmmm, today, not so convinced. Although I did want a nice relaxing time in a coffee shop, I got one – just not in the way I had envisaged.

I would have been better off reading a book, but my Kindle isn’t charged……

Apple Time Capsule Repair (Part 1)

Yes, of course, I am talking about an Apple Time Capsule, which, as we all know, is a little too prone to expiring young. Well, I am an owner of one of these afflicted devices; the downside is that I used it both as a backup device and as a wireless access point, so, in one go, I’ve lost two useful devices.

Well, as my alter ego MacGyver has a knack of resurrecting dead hardware and technology, I figured I would try to work some Lazarus-style resurrection on my Time Capsule. There are a number of sites on the Internet that deal with the results of the failure, and have a number of different ways of resurrecting the device, from the sensible to the frankly insane (external wall-wart with a butchered power socket). After a bit of research I found that two sites give a broadly similar method of repair, those of Chris Fackrell and LaPastenague, and therefore embarked on my journey.

First off, when I came to dismantling the TC, I admit I was a little impatient, primarily because I was rather annoyed about the device failing, and therefore I wasn’t taking as much care as I possibly could have done, and during the process of removing the rubber foot on the device, I tore the rubber 🙁 Overall this doesn’t make a huge difference to the finished repaired article, but it’s just a known annoyance from my perspective; I will live with it.

Capacitor Plague

So, rubber foot removed, screws pulled out from the bottom, and everything now nicely exposed, I got to what everyone was saying was the root cause – a fried power supply. They were right. I had fried capacitors on the PSU module. JOY. It’s not the first time I have had a device with fried capacitors; if anyone does a Google search for “capacitor plague” then you will find out the extent of the problem in recent years, and some potential theories on the root cause. Frankly, I don’t give a hoot about the precise root cause, unless it’s exacerbated by some pretty piss-poor design, and in the case of the Apple TC, that is precisely what we have. The TC is a device that has 3 modules – a PSU, a hard drive, and a system board – and all three modules generate heat by the very nature of having electricity running through them.

The hard drive, in my case a Western Digital Black Series, is designed for high performance devices, with barely a second thought for heat generation. It’s the sort of hard drive designed for those people that like to have oversized desktop cases with lots of colourful fans inside to keep the insides of the PC cool. (Hey – I was one of that crowd at one point in my life – so I won’t knock ’em.) Let’s put it another way: the drive definitely isn’t a Western Digital Green Series, designed for low power and low heat generation, the kind of drive required for a little device such as the Apple TC. Another issue: where is the cooling for the TC? Apple, in their infinite wisdom, included a fan inside the TC, but there are a couple of serious flaws with that in itself.

Firstly, and I would say foremost, a fan is designed to move air around – but in the case of the TC, where is it going to get air from? Nowhere. The aforementioned rubber foot on the device covers all the holes in the bottom of the case, and the holes in the bottom of the case are so few and far between (and so small) that they are essentially a decorative option in the aluminium panel at best (if the foot wasn’t covering them!).

The second problem – it “blows” air directly at the hard drive – the side of the hard drive, at the end furthest away from the power connectors – and there is, at best, a 2mm gap between the fan and the hard drive – so nowhere for the “air” to go to flow over the surface of the hard drive, which would make the fan somewhat more useful.

And thirdly – I’ve never heard the fan come on…….. ….ever…. *UNTIL* I pulled the cable off the heat sensor that is attached to the hard drive. Then the TC sounds like a muffled jet as the fan comes on at full chat, the lovely green LED on the front goes orange, and the TC warns you via AirPort Utility that the TC might be overheating. (More about that later – at the moment, I have a dead TC.) Bollocks, and definitely no dogs involved. So, what am I going to do to get this thing working again?

Well, following some of Chris Fackrell’s instructions, I started to look at getting the casing off the PSU module, whereupon the thin black plastic took on the structural consistency of a Cadbury’s Flake, so getting something resembling a secure and safe “casing” for the PSU module back together post-repair was nigh-on impossible, and unlike Chris, I didn’t fancy hand-cutting some special sheeting to fit around the module.

So – I took a chance. A quick scoot round a famous auction site (yes, ok then, eBay) later, I found a TC PSU in the USA, quickly entered my PayPal details, and somewhere around the 2 week mark, I had a nice little package waiting for me when I got home from work.

First thing I did was to assemble the PSU module in the TC, and fire the thing up without the case on (carefully – as there is a small risk of zapping oneself on mains level voltage if one is unduly careless) – and as this had passed the “not blow up and spit lots of broken electrics and magic smoke all over the place” test, I went a step further, loosely put the case back on, plugged in an ethernet cable, and fired up my MacBook to see if everything could see each other. Thankfully it could – at least the motherboard in the TC worked, and I could, at least in theory now, pull any data off the hard drive.

Next step: as the TC is a flawed design, heat-wise, I started to look at what I could do to alleviate this issue, and as both Chris Fackrell and LaPastenague have some fairly detailed methods for solving it, I thought that I would have a go at engineering a longer term “fix”.

Drilling hole in aluminium base plate of TC

So, both agree that the fan is useless, so let’s fix that first – well, we do have to start somewhere, right? And at least let’s start with something that will help prolong the replacement PSU’s life. I engineered a hole, 40mm(ish), in the bottom of the aluminium plate that forms the bottom of the case on the TC. I didn’t drill this with the rubber foot attached; I didn’t know exactly what the big drill bit would do to the rubber itself, and didn’t fancy the teeth catching the rubber and ripping it to shreds, leaving me with a completely destroyed rubber foot – I’d take it with a tear, as honestly, one can’t see it; when the system is in operation it is on the underside after all – and only I will know it’s there 🙂


Rubber foot, Circular hole

Now, in line with Chris’ details, and LaPastenague’s photos regarding the fan holes, they have quite clearly decided to prevent the ingress of (large) foreign bodies into the system by putting a grill over the fan opening, so, one auction site purchase (eBay) later, I had some stainless steel grill material to go over my newly engineered hole in the aluminium plate. At this moment in time, I took the opportunity to temporarily re-attach the rubber foot, and used a Stanley knife blade to cut out the circular hole in the rubber foot, using the hole I had just drilled/cut out as the template for the Stanley blade. I also took this opportunity to make another couple of modifications to the rubber foot.


Rubber foot, lots of screw holes.

Firstly, I figured that I was likely to need to pull the whole thing apart again at some point, and therefore I needed an easier way of getting at the insides of the device – and I don’t fancy pulling the rubber foot off the metal base plate again in any hurry – so, getting yet more inspiration (to save the perspiration) from Chris’ breakdown, I used a handy leather punch to “engineer” a series of tiny holes in the rubber foot, precisely where the 10 screws are used to hold the base plate onto the TC chassis. Keeping the little rubber circles that you have cut out also (again, per Chris) makes the whole thing look tidy and professional (as much as a hacked hole in the bottom of Apple gear looks) when everything is put back together again.


Using external HDD case to give me a nice profile

Again using Chris’s inspiration, I also took the time to engineer a “lip” in the edge of the rubber foot, to allow a better way for the air that has been forced through the PSU – which by now is rather warmer than it was on the way into the TC – to exit the chassis. I found that the edge of an external 2 1/2″ HDD chassis was ideal to get a shape for this relatively delicate work.

Resultant profile
Give the TC room to expel its warm air.

To be honest, at this point in the “fix”, considering I generally like Apple gear, I am feeling rather disappointed in the “engineering” designs for this product. It doesn’t take a genius to look at this design and go, “hang on a minute…” (or, in the immortal words of Michael Caine, “’ang on lads, I’ve got an idea……”).

Here I am with a product – albeit one a lot longer past its warranty than an awful lot of people got; some had TCs that died within 18 months, if not sooner – mine, though, did last the best part of 3 years before expiring. And what am I doing to this product? Yes – effectively butchering it to turn it into the device that it should have been when it was first purchased – my spin on that is that it’s a total failure of design, testing, and QA. I bought my MacBook Late 2008, and therefore I *think* I bought the TC early 2009, and it expired around December 2012, but with everything that life throws at one – it’s taken me till November 2013 to get around to fixing it ;-(


Grill, cut and ready for gluing into place.

Nicely snipped the grill to fit around the fan mounting pins, and then glued it onto the aluminium base plate – no drama there, used some “standardish” gooey stuff – nothing special, as I can’t find my special metal-loaded epoxy resin glue 🙁 I think that it’s in the garage “somewhere” – probably either amongst the spare bits of my nitro-methane powered remote control car, or remote helicopter, or some other crap. Next, again as per Chris’ beautiful details, I turned the fan upside down, shortened the rubber mounting pins, and re-attached the fan to the base plate. Now, this is where I got intrigued about the fan speed control.


Fan upside down now.

I mean, the TC has this little wire running from the board to the hard drive, and if you look at the semiconductor attached to the end of the wire, you find not what I would immediately expect, a thermistor, but a transistor – a 2N3904, with the base and collector connected electrically together – and if you use a little bit of brute force to heat this transistor up (I used a lighter I had kicking around on my desk), you can get the fan to turn on. And turn on it does – not just a little gentle blow, but a full-on jet engine, and the TC then whinges in the AirPort application that it’s overheating. Bloody hell Apple – it’s either one or the other, and neither is particularly “good”. For the moment, however, I need to use my TC in anger, so I’ve disconnected the temperature sensor that is attached to the hard drive, bolted the whole lot back together again, and am just going to live with the “pain” of the jet engine fan (and the whinges of AirPort). Other than the front LED flashing orange at me, and the occasional whinge from AirPort Utility to say that the TC is likely to be overheating (it isn’t – the fan is running at full chat, and blowing lots of air through the PSU! Trust me, the whine in my ear tells me so!!!!), the TC now works perfectly.

My first 10k

Well, it’s over….

I wasn’t expecting a hugely fast time; after all, it was my first race of any kind, except cross country at school, which is an ever more rapidly fading distant memory.

This event was part of the Blenheim Half Marathon in aid of the British Heart Foundation, which also has a 10k event and a 2k Family Fun Run course.

I was “talked” into doing the 10k by my work running buddies, and the furthest I have run in recent times (and by that, I am talking at least 10 years) is 5 miles, or 8k, so a 10k, whilst not *that* much further than the 8k, is quite a jump when one is mostly used to doing 5.3k.

I started ok, and for the first mile, things were a learning experience, as I hadn’t run in a body of people like that before; it takes some getting used to because it’s so different than either running on your own or with a couple of running buddies – it’s so close, and some people are obviously quicker than you and will try to go round you, and, conversely, you are quicker than a whole host of others, who you try to go round as well.

The second and third miles were really ok, just keeping the pace ticking over, at a slightly reduced pace compared to what I am used to training at. I’m normally doing around 9 minute miles training at the moment (which is a hell of a lot better than the 10:30s when I started back again seriously doing some mileage in June), but for this race I was doing ok at about 10:00s, which made the magic hour for the 10k quite possible – just.

Mile 4 was hard work, and a fair bit slower, but it was mile 5 that really hurt me.
Around the 4.5 mile marker, I felt my right knee go a bit funny, and I knew then that ITBS was potentially coming back, but there was no way I was stopping now, so I eased off a bit.

This did help, until one of my running buddies caught up with me, and we kept going together for the best part of the next mile; however, at about 5.6 miles my knee just stopped functioning properly, and I was forced to a walk, which was more like a limp.

I could see a corner, and knew that the end of the race was the relatively short uphill to the finish line just round that corner, so I said bugger it, and put in a final push for home and over the line. The very nice people lining the finish straight, including the very special Cath, were cheering us all as we ran down this last little bit – I even got a mention over the PA system to help me over those last few metres.

My official time was 1:05:38, my iPhone time was 1:05:41, so a good correlation there 🙂 and although it was outside the hour mark that I was really hoping for (even a 59:59 would have made my day), I guess for a first attempt, especially in the light of an injury sustained whilst doing the race, it’s pretty darn good, so I suppose I can’t complain too loudly.

So to the injury.

Well, the 5 mile/8k runs I had done were in the 3 weeks preceding the 10k race, as training runs, and I had got on “ok” with them – no issues or injuries, and only the normal “I have used my muscles” ache the following morning, which is, as far as I am concerned, a good thing.

However, in the last month and a half, I had done a 4 mile course a couple of times, and on the second occasion, had run into problems with pain in the outside of my knee, with an anatomical part known as the iliotibial band, a problem known as ITBS.

It’s rather painful, and stairs make it quite a challenge; however, it is rather mild, so I won’t complain, but I will now start to look at fixing myself to make sure I don’t have issues with this part of my body next time I do a 10k.

Yup, you read it right, I am going to do another 10k, but I am going to go quicker, and I am not going to break doing it, as I will be fitter and stronger when I do it.

I’ve already done some reading about the causes, rehabilitation, and training to fix and prevent ITBS, and, after a day or two of rest, I’m bloody well going to start on those training things to get me ready for the next one.

Don’t know when I will do it mind, but, definitely got the race bug now….

Ads…….

Well, it’s kinda annoying in a way, but I do really need to look at trying to recoup some of the cost of running this site, and the other sites and things I do, so, apologies, kinda, for putting some ads on the site.

However, the plus side is that there may well be ads that are of some interest to you – so all in all, that isn’t going to be a bad thing overall; you might discover something you didn’t know about before, and, as I have always said – keep an open mind 🙂 .

I’ve decided to use Google Adsense…. why ?

Well, with this blog software it makes my life relatively easy, I don’t have a huge amount of time at the moment, so, I am going for some simplicity.

Whilst I work to get things how I want them, aesthetically, there will be some unpleasantness on the site. Sorry about that.

I should have a sign, “Men at work”  🙂

I’ve entered a 10k race……. (I must be mad)

For those of you who know me personally, I’ve been through periods in life in the not too distant past where it is fair to say I have been somewhat on the large side.

Well, stepping onto those scales some time ago, and getting (almost) a heart attack at the size of the number peering back at me from the floor, I decided to do something about it.

First of all, lots of low impact exercise like cycling, watching the beer (I know – that is shite), watching what I eat, gym membership and the like..

Well, good news, it’s worked.

I’m a hell of a lot lighter than I was, although I will admit I am not *there* yet, but definitely on a curve to my target.

Well, I’ve now been running on and off for a few months, although breaking my big toe in January (Taekwondo) and the ensuing discomfort in running until May kinda put paid to as much training as I would have liked.

However, since May I have started by hauling my somewhat lighter arse round a little route at work, on and off with a couple of workmates, and pushed things from 2 miles out to a 5 miler, which I don’t think is too bad.

Well, now, these running buddies of mine have suggested I take part in a 10k race……

….. So I am..

If anyone reads this, then some sponsorship wouldn’t go amiss.

I am running for the British Heart Foundation – the BHF are the prime organisers (like the London to Brighton Bike Ride) – and I guess everybody has a heart, so, with that, here is my link for sponsorship….

http://www.justgiving.com/KieranNReynolds

Or, you can donate by text, by texting PHCH70 £2 to 70070.

Cheers all !


Is it raining in the cloud ?

I’m in the cloud, and I am getting cold and wet, it’s raining in my cloud.

Cloud services are supposed to be the saviour for corporations, and, I dare say there are a lot of really good real world examples where this is the case.

Take companies that need extra processing power as and when they need it, using something like AWS (Amazon Web Services) or Google Apps, or even (for me – shock and horror) something like Office 365 – which is all well and good.

However, let’s take me as a not so good example…

I would say that I’ve never had a normal regular-Joe setup at home, granted, but let’s look at what I need.

Backup….. Well, sorta; it’s more like making sure I have multiple copies, but those multiple copies get synchronised across multiple devices: my MacBook, my Linux laptop, my iPad, and my iPhone, and maybe even a Windows machine if I desperately need to use Windows.

In short, I need a cleaner way of keeping all the files on these machines up to date, so I’ve looked at Dropbox, Google Drive, and even Apple’s cloud services.

So, let’s look at the figures.

Dropbox, 500GB is $49.99 per month, if you pay in advance for a year.

Apple’s cloud services top out at 55GB, which isn’t even enough to back up the entire contents of my 64GB iPad – eh, what????

Let’s look at Google Drive: that is $49.99 for 1TB.

The question, I guess, is how much data I have.

Ok – when I first started to write this entry, the Documents directory on the laptop I was writing it on was 3.6GB, so within the “free” range of all the available options, but as I have consolidated all my documents across a number of devices, that is now 5.8GB, which is over the 5GB of free space.

However, this 5.8GB doesn’t include my photographs – those take a little more than that – like over 100x as much, at currently 450GB(ish) – and that includes all the RAW files, as I shoot in RAW format (and used to shoot in RAW+JPG).

That 450GB is a lot of data, and therefore well exceeds the Apple cloud offering by some considerable margin, but comes in under the 500GB bracket for Dropbox and within Google’s 1TB.

Now, when I go on a shoot, say in Wales or the Lake District, or say an air show, then I can easily burn through 24GB a day in photographs.
Ok, some of these are rubbish that I really should get rid of, but, historically, I haven’t.

Taking that into consideration, I will easily burn through Dropbox’s “sane” offering, at least without having the 1TB option as a “team”.

That leaves me Google.

No real problem there.

But – let me look at another option.

I host my own gear, actually on the server that this website runs on.

It sits in a datacentre somewhere in the EU – Germany, to be precise (not that I am giving anything away here – it’d be trivial to look up the IP address of this site and work it out).

Given that I have a bit of space left on the drives that this site lives on…. I installed “OwnCloud”.

From OwnCloud’s own description, it’s effectively an Open Source Dropbox-a-like piece of software that lives on an Apache server and can turn that Apache server into a cloud instance, and OwnCloud has clients for Windows, Mac, Linux, iPhone/iPad and Android devices.
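For the record, the server-side install really was just a case of dropping the code into the web root – a rough sketch of the sort of thing I did, with the version number and paths here being placeholders rather than gospel:

cd /var/www                                 # the Apache document root (yours may differ)
wget https://download.owncloud.org/community/owncloud-x.y.z.tar.bz2   # x.y.z = whatever is current
tar xjf owncloud-x.y.z.tar.bz2              # unpacks into ./owncloud
chown -R www-data:www-data owncloud         # let Apache write its config and data directory

Then it’s just a case of browsing to http://your.server/owncloud, creating the admin account, and pointing the clients at it.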

So – from the outside appearance, I can run my own cloud data storage services on a device I rent – i.e. this server in Germany.

I’ll admit this server doesn’t have the disk space to hold all my photographs – but I can easily purchase an upgraded machine, migrate all my stuff across, and then it would do, at least for another couple or three years I reckon.

So, for the moment, I have installed this server software, and clients for my Linux laptop, my MacBook, and even my iPhone.

I have therefore started syncing up this laptop, and that is where I have hit the biggest single failure in ANY cloud storage service.

Speed.

No, not the speed of the server at the remote end, or even the speed of the local machine running the client.

But, the speed of my link to the Internet.

No matter where I am away from a corporate environment, like “home”, I have ADSL, and as the name says, it’s “asymmetric” – so the speed is faster one way than the other; in the case of ADSL, download from the Internet is the faster, by quite some way.

For example, the link I am on at the moment is 4.4Mbps down and 448kbps up.

So, if I want to download an iso image for a new OS that I am looking at, or even the new Adobe Photoshop Elements, that means I can download the files at 4.4Mbps/8 bits = 550kB/s.

That means that a 1MB file will download in a shade under 2 seconds, but a 4GB DVD iso will take around 2 hours.

That’s not too unreasonable as a general thing, I don’t need huge files often.

But, let me translate that to my problem.

My Documents directory will take around 2 hours to download.

However, that would be the end of the issue if I already had those files on the server, which at the moment I don’t.

I need to upload them……

…. at 448kbps – which is a pathetic ~55kB/s – that will take, what, ~25 hours ….

A DAY!!!!!
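For anyone who fancies checking my back-of-an-envelope maths, here’s the sum, using the ~4GB Documents figure (real-world protocol overhead on an ADSL line only makes it worse):

# upload time = size in kB / upload speed in kB/s
awk 'BEGIN { gb=4; kBps=55; printf "%.1f hours\n", gb*1e6/kBps/3600 }'
# prints 20.2 hours - and it's nearer 29 hours for the full 5.8GB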

There may be some companies that wouldn’t mind me going into work, plugging my personal laptop into the network during lunch, say, or otherwise out of hours, and throwing my ~4GB at any cloud storage service, without having a complete hissy fit either from a security perspective or a bandwidth perspective.

However – these companies are likely to be the exception rather than the rule, so I’m stuck, personally, whichever cloud storage option I pick.

I sure ain’t going to start talking about the whole Bring Your Own Device argument either, not here, not yet – it’s not the place, or crucially, the time.

The cloud works wonderfully within a corporate or educational/research environment, for, say, sharing documents with colleagues in London, Leeds or even Los Angeles, as businesses and educational/research establishments will have fast, symmetric connections to the Internet, and in some instances (like where I work) actually be part of the Internet themselves.

However, back to poor old me, and my requirement to use the cloud.

It will take me ~25 hours to “seed” my cloud instance from this one machine, and this little laptop is a machine I don’t tend to have “much” data on – it only includes my Documents, nothing like my Music (that’s ~12GB), and of course it doesn’t include my 450GB of photos.

And therein lies the problem with the cloud – it’s not the cloud where it’s raining, it’s under the cloud, me, and you, at home, wanting to upload our data to the cloud.

Great when the data is already there, like Google Apps, but getting it up into the cloud, like my documents and photographs, is a giant pain in the backside due to the lack of speed on the network that has to be used to upload it.

Now, be clear, I am not going to blame this on the cloud – the cloud itself isn’t the problem here, not by a long shot – but on broadband, and the nature of the beast; a beast that is now severely flawed.

ADSL started life when all that one really needed to do was to download data from the Internet, and since about 1993 that is all most people have ever done.

However, the cloud is changing the game, for the better in a lot of ways. Trouble is, the networks that were historically built to service the Internet for consumers are now woefully inadequate to keep up with the demand for data flow, and are essentially not fit for the new purpose of this, the “new” Internet and the new world order of “cloud” – certainly not cloud storage anyway. Cloud applications are a different matter – they are all about sending data from the Internet to the clients.

One thing to look forward to is fibre connectivity.

If you can get it.

I couldn’t – well, at least not until June this year, as BT appeared to have screwed up the rollout of FTTC – otherwise known as Fibre To The Cabinet – in my area: they missed out a cabinet in mine, the newest of estates in the town I live in.

There are people in outlying areas that have this capability – but no, not me, or anyone on my estate, and it’s taken BT a very long time to undo their screwup of missing out upgrading this one cabinet.

30 June 2013 is the date I might have been able to get fibre-based broadband at home, and yes, this will massively improve my ability to upload my files to the cloud storage provider of my choice.

Even then though, it will only be ~15Mbps up – 1.8MB/s – which admittedly will still mean it will take me a long time to upload my photographs – a shade under 3 days – but well under an hour to upload my documents, a distinct improvement.

Like real clouds, the Internet Cloud, it’s great when you are in it, like I am at work, but for me, at least home-wise at the moment, whilst it isn’t raining in the cloud, I definitely am under a cloud, and I feel a little cold and damp as a result.

Facepalm – sudoers not working

DOH!!!!!

Always remember to check the command you are asking to be run as sudo…

Here I was the other day, setting up Nagios NRPE agents on a new server at work, and I kept getting “NRPE Unknown” errors in the Nagios console.

Normally, this is down to the user who is running the command on the client server not having the permissions to execute the check_xxxx command, and, as these commands are potentially sensitive, they are run via sudo without a password.

On Debian, there is a lovely option to specify a secondary sudoers file under /etc/sudoers.d/sudoers, which allows an identical file to be copied to all servers, yet still lets the main sudoers file in /etc/sudoers be unique to each.
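For the curious, the sort of entry I mean looks something like this – a minimal sketch, with the plugin path being the standard Debian one and just the one check shown:

# /etc/sudoers.d/sudoers - let the nagios user run NRPE checks without a password
nagios ALL=(root) NOPASSWD: /usr/lib/nagios/plugins/check_ntp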

Now, I had copied this /etc/sudoers.d/sudoers file from another server for which I knew the Nagios NRPE services were working…. yet, inexplicably, I was getting problems in the Nagios console – “NRPE Error – unknown” – which I know means NRPE not being able to run the remote command, 99% of the time a sudo error.

The last time I saw this I had screwed up the sudoers file, such that the nagios user didn’t have the permissions to run the commands with “NOPASSWD”.

In this case, that wasn’t the problem, after all I had copied the sudoers file from a working machine … Right?

Well, yes.
So that wasn’t the problem.

Log onto the client, su – from root to the nagios user, and of course, the nagios user’s shell is /bin/false – one small change later, at least for testing, the nagios user has a shell.

Try again, su – nagios, and I get a shell and a prompt, ok, good start.

Now, try /sbin/sudo /usr/lib/nagios/plugins/check_ntp, and what happens?
I get prompted to enter the nagios user’s password.

Hmmmm – last I looked I hadn’t even set one.

Exit nagios user, become root again, set (crap) password for user, su – to nagios user and try again, and even enter the (crap) password…

Same bloody error.

WTF!?!?!

Am I really going mad? (Ok don’t answer that one)
Ok, silly thought…

ls -al /usr/lib/nagios/plugins/check_ntp
ls: cannot access /usr/lib/nagios/plugins/check_ntp: No such file or directory

Aaaaaaaaaah.

ls -al /usr/lib/nagios/plugins
.
..

Ooooooops… No files.

No files exist to run, let alone run via sudo.

Lesson to be learnt: sudo matches its rules on the full path of the command, so if the command doesn’t exist it can’t match your NOPASSWD rule – and sudo falls back to asking for a password.

Second lesson to be learnt: make sure the commands you want to run via sudo actually exist!!!!
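A ten-second sanity check that would have saved me the whole dance – the path being the standard Debian one (the plugins normally come from the nagios-plugins package):

# is the plugin actually there and executable, before we blame sudoers?
test -x /usr/lib/nagios/plugins/check_ntp && echo "plugin present" || echo "plugin missing"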

Definitely a facepalm day.

Upgrading Drupal on Oracle (again)

drush vset maintenance_mode 1

Doesn’t bloody work with an Oracle install. Typical.

Well – today I have neither the time nor the inclination to investigate, so I went in via the GUI and set the site to maintenance mode.

Then I got on with the command line options.

drush dl                              # download the latest Drupal 7 release into drupal-7.xy
chown -R user:web-user drupal-7.xy    # fix ownership on the new tree
cd drupal-7.xy
rm -rf ./sites                        # bin the stock sites/ directory
cd ../
cp -pR html/sites/ ./drupal-7.xy/     # copy the live sites/ across, preserving permissions
chmod 000 drupal-7.xy/*.txt           # hide CHANGELOG.txt and friends
chmod 644 drupal-7.xy/robots.txt      # …but keep robots.txt readable


Yip-de-deee.

So – step one done, downloaded and copied most of the things as required.
Now need to patch the update.inc file as I detailed here.

cp html/includes/update.inc.patch drupal-7.xy/includes/   # bring the Oracle patch across
cd drupal-7.xy/includes/
cp update.inc update.inc.bak          # keep a backup of the stock file
patch --dry-run -i update.inc.patch   # check the patch applies cleanly first
patch -i update.inc.patch             # apply it for real
diff update.inc.bak update.inc        # eyeball what changed
chown user update.inc

Phew – now done the patch.

One last thing – copy the Oracle Database driver to the new area.

cp -pR html/includes/database/oracle ./drupal-7.xy/includes/database/

Now move the new directory into place, and run update.php

mv html/ html-old
mv drupal-7.xy/ html/

Browse to http://www.site.name/update.php

All done..

Use Gui to turn off maintenance mode.

Clear cache from CLI

cd html/
drush cc all

Yippeeee….

Reckon I should (could) script that now…..
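Something like this, perhaps – a rough, untested sketch of the steps above, with the version and the user/web-user names being the same placeholders as in the commands earlier:

#!/bin/bash
# rough sketch of the upgrade steps above - untested, assumes the same layout as my server
set -e
VER="7.xy"                                # placeholder - the release you are upgrading to
NEW="drupal-${VER}"

drush dl "drupal-${VER}"                  # grab the new release
chown -R user:web-user "${NEW}"           # fix ownership
rm -rf "${NEW}/sites"                     # bin the stock sites/ directory
cp -pR html/sites/ "${NEW}/"              # carry the live sites/ across
chmod 000 "${NEW}"/*.txt
chmod 644 "${NEW}/robots.txt"

cp html/includes/update.inc.patch "${NEW}/includes/"
( cd "${NEW}/includes" && patch -i update.inc.patch )    # the Oracle patch

cp -pR html/includes/database/oracle "${NEW}/includes/database/"   # Oracle DB driver

mv html html-old                          # swap the new tree into place
mv "${NEW}" html

echo "Now browse to /update.php, turn maintenance mode off, then: cd html && drush cc all"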