Coffee shop relaxation

This is the result of what I was going to blog while I was relaxing and having a coffee and a sarnie, by way of a very late lunch after a morning TKD session (that, incidentally, was a sweat box, but very much fun 😀).

Well, I thought I’d attempt the classic techie shopping and lunch trip – and by the by, have a nice coffee in a random coffee shop and do some surfing.

So, after getting one or two little items – a little less than I had thought, or wanted, but one store didn’t have what I was looking for – I thought I would sit down in MK’s Starbucks or Costa and enjoy some gentle caffeine-enhanced relaxation.

So – first off, I wanted Internet, you know, to catch up on some of the news I’ve missed, maybe even write a blog post about something. Simple enough, I guess: my MacBook connected to “_The_Cloud” – but then I can’t surf without signing up, or remembering what I signed up with on my iPhone what seems like a billion years ago, and connecting and resetting my password would mean I needed access to email – which requires Internet access… which… yeah, OK, see the problem? (Astute readers will ask why I didn’t read the email off my phone – well……)

OK then, let’s try my iPhone itself, and tether.

Four attempts at connecting – OK, I can’t whinge too loudly at that, I haven’t connected this MacBook to my iPhone before (I did with my old MacBook).

A tiny glimpse of Internet via my Galaxy SIII, so I kick off an update of a plugin on a website of mine, but…… then nothing.

So that is Problem 2 – Vodafone: whilst saying I have 4 bars and a 3G signal (this being an iPhone 4S – no 4G), I have now lost the Internet. (So, back to the problem about getting email….)

OK – try again, with my Samsung Galaxy SIII mini, and although this connected first time (I have used that phone with this MacBook whilst I was on holiday last week), again, although the service provider is saying decent signal – I’ve got no Internet.

And the biggest pain in the arse of all – I didn’t charge any of my devices last night, so whilst I am now currently writing this on a MacBook Pro in Pages instead of online on my blog, I’m doing it with only 11% of battery 🙁

OK – it reckons 0:39 hours left, which should mean I make it to the end of my steak and cheese sarnie and coffee, but, pretty much bugger all else.

So I’ve switched the keyboard backlight off, and dropped the screen brightness down to a point where it’s dark enough to almost think the text is dark grey on a slightly lighter dark grey – and the battery life has now dropped to 0:35 remaining.

And the update of a plugin on my site still hasn’t finished – the connection I have, tenuous at best, hasn’t been fast enough to finish the update before I was careless and lost the Internet.

So what I have done with this blog entry, is write it in Pages, so that I can get my thoughts down, and will upload when I get home, to something approaching a broadband connection.

Question is, will I get to the end of writing the blog entry before the battery on the MacBook dies?

Chances are actually pretty good, it’s a decent bit of kit to be fair, but technology, batteries especially, don’t last forever.

Now my iPhone is at 15%, the GSIII is at 21%, and this MacBook is at 7%; I’ve finished my coffee, and I’m more than ready to go home.

So, technology makes things easier ?

Hmmmm, today, not so convinced, although I did want a nice relaxing time in a coffee shop, I got one, just not in the way I had envisaged.

I would have been better off reading a book, but my Kindle isn’t charged……

Apple Time Capsule Repair (Part 1)

Yes, of course, I am talking about an Apple Time Capsule, which, as we all know, is a little too prone to expiring a little young. Well, I am an owner of one of these afflicted devices; the downside is that I used it both as a backup device and as a wireless access point, so, in one go, I’ve lost two useful devices.

Well, as my alter ego MacGyver has a knack of resurrecting dead hardware and technology, I figured I would try to work some Lazarus-style resurrection on my Time Capsule. There are a number of sites on the Internet that deal with the results of the failure, and have a number of different ways of resurrecting the device, from the sensible to the frankly insane (external wall-wart with a butchered power socket). After a bit of research I found that two sites give a broadly similar method of repair, those of Chris Fackrell and Lapastenague, and so I embarked on my journey.

First off, when I came to dismantling the TC, I admit I was a little impatient, primarily because I was rather annoyed about the device failing, and therefore I wasn’t taking as much care as I possibly could have done, and during the process of removing the rubber foot on the device, I tore the rubber 🙁 Overall this doesn’t make a huge difference to the finished repaired article, but it’s just a known annoyance from my perspective; I will live with it.

Capacitor Plague

So, rubber foot removed, screws pulled out from the bottom, and with everything now nicely exposed, I got to what everyone was saying was the root cause – a fried power supply. They were right. I had fried capacitors on the PSU module. JOY. It’s not the first time I have had a device with fried capacitors; if you do a Google search for “capacitor plague” you will find out the extent of the problem in recent years, and some potential theories on the root cause. Frankly, I don’t give a hoot about the precise root cause, unless it’s exacerbated by some pretty piss-poor design, and in the case of the Apple TC, that is precisely what we have. The TC is a device that has 3 modules – a PSU, a hard drive, and a system board – and all three modules generate heat by the very nature of having electricity running through them.

The hard drive, in my case a Western Digital Black Series, is designed for high-performance devices, with barely a secondary care for heat generation. It’s the sort of hard drive designed for those people that like to have oversized desktop cases with lots of colourful fans inside to keep the PC cool. (Hey – I was one of that crowd at one point in my life – so I won’t knock ’em.) Let’s put it another way: the drive definitely isn’t a Western Digital Green Series, designed for low power and low heat generation, the kind of drive required for such a little device as the Apple TC. Another issue: where is the cooling for the TC? Apple, in their infinite wisdom, included a fan inside the TC, but there are a couple of serious flaws with that in itself.

Firstly, and I would say foremost, a fan is designed to move air around – but in the case of the TC, where is it going to get air from? Nowhere: the aforementioned rubber foot on the device covers all the holes in the bottom of the case, and the holes in the bottom of the case are so few and far between (and so small) that they are essentially a decorative option in the aluminium panel at best (if the foot wasn’t covering them!).

The second problem – it “blows” air directly at the hard drive – the side of the hard drive, at the end furthest away from the power connectors – and there is, at best, a 2mm gap between the fan and the hard drive – so nowhere for the “air” to go over the surface of the hard drive, which would make the fan somewhat more useful.

And thirdly – I’ve never heard the fan come on… ever… *UNTIL* I pulled the cable off the heat sensor that is attached to the hard drive; then the TC sounds like a muffled jet as the fan comes on at full chat, the lovely green LED on the front goes orange, and the TC warns you via AirPort Utility that it might be overheating. (More about that later – at the moment, I have a dead TC.) Bollocks, and definitely no dogs involved. So, what am I going to do to get this thing working again?

Well, following some of Chris Fackrell’s instructions, I start to look at getting the casing off the PSU module, whereupon, the thin black plastic takes on the structural consistency of a Cadbury’s flake, so getting something resembling a secure and safe “casing” to the PSU module back together post repair was nigh-on impossible, and unlike Chris, I didn’t fancy hand cutting some special sheeting to fit around the module.

So – I took a chance. A quick scoot round a famous auction site (yes, OK then, eBay), and I found a TC PSU in the USA, so I quickly entered my PayPal details, and somewhere around the 2 week mark, I had a nice little package waiting for me when I got home from work. First thing I did was to assemble the PSU module in the TC, and fire the thing up without the case on (carefully – as there is a small risk of zapping oneself on mains-level voltage if one is unduly careless) – and as this had passed the “not blow up and spit lots of broken electrics and magic smoke all over the place” test, I went a step further, loosely put the case back on, plugged in an ethernet cable, and fired up my MacBook to see if everything could see each other – and thankfully they could. At least the motherboard in the TC worked, and I could, at least in theory, now pull any data off the hard drive. Next step: well, as the TC is a flawed design, heat-wise, I started to look at what I could do to alleviate this issue, and as both Chris Fackrell and LaPastenague have some fairly detailed methods for solving this, I thought that I would have a go at engineering a longer-term “fix”.

Drilling hole in aluminium base plate of TC

So, both agree that the fan is useless, so let’s fix that first – well, we do have to start somewhere, right? And at least let’s start with something that will help prolong the replacement PSU’s life. I engineered a hole, 40mm (ish), in the bottom of the aluminium plate that forms the bottom of the case on the TC. I didn’t drill this with the rubber foot attached: I didn’t know exactly what the big drill bit would do to the rubber itself, and didn’t fancy the teeth catching the rubber and ripping it to shreds, leaving me with a completely destroyed rubber foot – I’d take it with a tear, as honestly, one can’t see it when the system is in operation; it is on the underside after all – and only I will know it’s there 🙂

 

Rubber foot, Circular hole

Now, in line with Chris’ details and LaPastenague’s photos regarding the fan holes, they have quite clearly decided to prevent the ingress of (large) foreign bodies into the system by putting a grill over the fan opening, so, one auction site purchase (eBay) later, I had some stainless steel grill material to go over my newly engineered hole in the aluminium plate. At this moment in time, I took the opportunity to temporarily re-attach the rubber foot, and used a Stanley knife blade to cut out the circular hole in the rubber foot, using the hole I had just drilled/cut out as the template for the Stanley blade. I also took this opportunity to make another couple of modifications to the rubber foot.

 

Rubber foot, lots of screw holes.

Firstly, I figured that I was likely to need to pull the whole thing apart again at some point, and therefore I needed an easier way of getting at the insides of the device – and I don’t fancy pulling the rubber foot off the metal base plate again in any hurry – so, getting yet more inspiration (to save the perspiration) from Chris’ breakdown, I used a handy leather punch to “engineer” a series of tiny holes in the rubber foot, precisely where the 10 screws are used to hold the base plate onto the TC chassis. Keeping the little rubber circles that you have cut out also (again, as Chris says) makes the whole thing look tidy and professional (as much as a hacked hole in the bottom of Apple gear looks) when everything is put back together again.

 

Using external HDD case to give me a nice profile

Again using Chris’s inspiration, I also took the time to engineer a “lip” in the edge of the rubber foot, to allow a better way for the air that has been forced through the PSU – which by now is rather warmer than it was on the way into the TC – to exit the chassis. I found that the edge of an external 2 1/2″ HDD chassis was ideal to get a shape for this relatively delicate work.

Resultant profile
Give the TC room to expel its warm air.

To be honest, at this point in the “fix”, considering I generally like Apple gear, I am feeling rather disappointed in the “engineering” designs for this product. It doesn’t take a genius to look at this design and go, “hang on a minute…” (or, in the immortal words of Michael Caine, ” ‘ang on lads, I’ve got an idea……”). Here I am with a product – albeit one a lot longer past its warranty than an awful lot of people got; some had TCs that died within 18 months, if not sooner, whereas mine did last the best part of 3 years before expiring – and what am I doing to this product? Yes – effectively butchering it to turn it into the device that it should have been when it was first purchased. My spin on that is that it’s a total failure of design, testing, and QA. I bought my MacBook in late 2008, and therefore I *think* I bought the TC in early 2009, and it expired around December 2012, but with everything that life throws at one – it’s taken me till November 2013 to get around to fixing it 🙁

 

Grill, cut and ready for gluing into place.

I nicely snipped the grill to fit around the fan mounting pins, and then glued it onto the aluminium base plate – no drama there; I used some “standardish” gooey stuff – nothing special, as I can’t find my special metal-loaded epoxy resin glue 🙁 I think that it’s in the garage “somewhere” – probably either amongst the spare bits of my nitro-methane powered remote control car, or remote helicopter, or some other crap. Next, again, as with Chris’ beautiful details, I turned the fan upside down, shortened the rubber mounting pins, and re-attached the fan to the base plate. Now, this is where I got intrigued about the fan speed control.

 

 

Fan upside down now.

I mean, the TC has this little wire running from the board to the hard drive, and if you look at the semiconductor attached to the end of the wire, you find not what I would immediately expect, a thermistor, but a transistor – a 2N3904, with the base and collector connected electrically together – and if you use a little bit of brute force to heat this transistor up (I used a lighter I had kicking around on my desk), you can get the fan to turn on. And turn on it does: not just a little gentle blow, but a full-on jet engine, and the TC then whinges in the AirPort application that it’s overheating – bloody hell Apple, either one or the other, and neither particularly “good”. For the moment, however, I need to use my TC in anger, so I’ve disconnected the temperature sensor that is attached to the hard drive of the TC, bolted the whole lot back together again, and am just going to live with the “pain” of the jet engine fan (and the whinges of AirPort). Other than the front LED flashing orange at me, and the occasional whinge from my AirPort Utility to say that the TC is likely to be overheating (it isn’t – the fan is running at full chat and blowing lots of air through the PSU! Trust me, the whine in my ear tells me so!!!!), the TC now works perfectly.

Ads…….

Well, it’s kinda annoying in a way, but I do really need to look at trying to recoup some of the cost of running this site, and other sites and things I do, so, apologies, kinda, for putting some ads on the site.

However, the plus side is that there may well be ads that have some interest to you – so, all in all, that isn’t going to be a bad thing; you might discover something you didn’t know about before, and, as I have always said – keep an open mind 🙂 .

I’ve decided to use Google Adsense…. why ?

Well, this blog software makes my life relatively easy, and I don’t have a huge amount of time at the moment, so I am going for some simplicity.

Whilst I work to get things how I want them, aesthetically, there will be some unpleasantness on the site. Sorry about that.

I should have a sign, “Men at work”  🙂

Is it raining in the cloud ?

I’m in the cloud, and I am getting cold and wet, it’s raining in my cloud.

Cloud services are supposed to be the saviour for corporations, and, I dare say there are a lot of really good real world examples where this is the case.

Take companies that need extra processing power as and when they need it, using something such as AWS (Amazon Web Services), or Google Apps, or even (for me – shock and horror) something like Office 365 – which is all well and good.

However, let’s take me as a not so good example…

I would say that I’ve never had a normal regular-Joe setup at home, granted, but let’s look at what I need.

Backup….. Well, sorta; it’s more like making sure I have multiple copies, but those multiple copies get synchronised across multiple devices: my MacBook, my Linux laptop, my iPad, and my iPhone, and maybe even a Windows machine if I desperately need to use Windows.

In short, I need a cleaner way of keeping all the files on those machines up to date, so I’ve looked at Dropbox, Google Drive, and even Apple’s cloud services.

So, let’s look at the figures.

Dropbox, 500GB is $49.99 per month, if you pay in advance for a year.

Apple’s cloud services top out at 55GB, which isn’t even enough to back up the entire contents of my 64GB iPad – eh, what????

Let’s look at Google Drive: that is $49.99 for 1TB.

The question, I guess, is how much data I have.

OK – when I first started to write this entry, the Documents directory on the laptop that I was writing this entry on was 3.6GB, so within the “free” range of all the available options, but as I have consolidated all my documents across a number of devices, that is now 5.8GB, which is over that 5GB of free space.

However, this 5.8GB doesn’t include my photographs – those take a little more than that – like over 100x as much, at currently 450GB(ish) – and that includes all the RAW files, as I shoot in RAW format (and used to shoot in RAW+JPG).
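
For what it’s worth, those sizes are just what du reports – something along these lines (the photo path is wherever your archive actually lives, not a real path of mine):

# quick check of how much data would need to go up
du -sh ~/Documents
du -sh /path/to/photo/archive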

That 450GB is a lot of data, and therefore exceeds the Apple cloud offering by some considerable margin, but fits within the 500GB bracket for Dropbox and well within Google’s 1TB.

Now, when I go on a shoot, say in Wales or the Lake District, or say an air show, then I can easily burn through 24GB a day in photographs.
OK, some of these are rubbish that I really should get rid of, but, historically, I haven’t.

Taking that into consideration, I will easily burn through Dropbox’s “sane” offering, at least without having the 1TB option as a “team”.

That leaves me Google.

No real problem there.

But – let me look at another option.

I host my own gear, actually on the server that this website runs on.

It sits in a datacentre somewhere in the EU – Germany, to be precise (not that I am giving anything away here – it’d be trivial to look up the IP address of this site and work it out).

Given that I have a bit of space left on the drives that this site lives on… I installed “OwnCloud”.

From OwnCloud’s own description, it’s effectively an Open Source Dropbox-a-like piece of software that lives on an Apache server and can turn that Apache server into a cloud instance, and OwnCloud has clients for Windows, Mac, Linux, iPhone/iPad and Android devices.

So – from the outside appearance, I can run my own cloud data storage services on a device I rent – i.e. this server in Germany.

I’ll admit this server doesn’t have the disk space to hold all my photographs – but I can easily purchase an upgraded machine, migrate all my stuff across, and then it would do, at least for another couple or three years I reckon.

So, for the moment, I have installed this server software, and a client for my Linux laptop, and my MacBook, and even my iPhone.
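
For anyone curious, the server side of OwnCloud really is just code dropped into an Apache docroot plus a web installer – roughly this sketch (the version number and paths are placeholders, not gospel):

# on the server: unpack OwnCloud into the web root and hand it to Apache
cd /var/www
tar -xjf owncloud-x.y.z.tar.bz2        # tarball fetched from owncloud.org beforehand
chown -R www-data:www-data owncloud
# then finish the setup at https://your.server/owncloud in a browser,
# and point the desktop/mobile sync clients at the same URL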

I have therefore started syncing up this laptop, and that is where I have hit the biggest single failure in ANY cloud storage service.

Speed.

No, not the speed of the server at the remote end, or even the speed of the local machine running the client.

But, the speed of my link to the Internet.

No matter where I am away from a corporate environment, like “home”, I have ADSL, and as the name says, it’s “asymmetric” – the speed is faster one way than the other, and in the case of ADSL, download from the Internet is the faster direction, by quite some way.

For example, the link I am on at the moment is 4.4Mbps down and 448kbps up.

So, if I want to download an ISO image for a new OS that I am looking at, or even the new Adobe Photoshop Elements, that means I can download the files at 4.4Mbps / 8 bits = 550kB/s.

That means that a 1MB file will download in a shade under 2 seconds, but a 4GB DVD iso will take about 2 ½ hours.
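
The sums behind those numbers are nothing more than size divided by rate – a back-of-envelope sketch in shell, using the figures above:

# rough transfer-time arithmetic: size in GB, link speed in kbps
transfer_time() {
    awk -v gb="$1" -v kbps="$2" 'BEGIN {
        kBps  = kbps / 8                      # kbps -> kB/s
        hours = (gb * 1000000 / kBps) / 3600  # GB -> kB, divide by rate, convert to hours
        printf "%sGB at %skbps -> ~%.1f hours\n", gb, kbps, hours
    }'
}

transfer_time 4   4400    # DVD iso, downstream
transfer_time 5.8 448     # Documents directory, upstream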

That’s not too unreasonable as a general thing, I don’t need huge files often.

But, let me translate that to my problem.

My Documents directory will take around 2 hours to download.

However, the download figure would only be relevant if I already had those files on the server, which at the moment I don’t.

I need to upload them……

…. at 448kbps – which is a pathetic 55kB/s – that will take, what, ~25 hours….

A DAY!!!!!

There may be some companies that wouldn’t mind me going into work, plugging my personal laptop into the network during lunch, say, or otherwise out of hours, and throwing my ~4GB at a cloud storage service, without having a complete hissy fit from either a security perspective or a bandwidth perspective.

However – these companies are likely to be the exception rather than the rule, so I’m stuck, personally, for options for cloud storage.

I sure ain’t going to start talking about the whole Bring Your Own Device argument either, not here, not yet – it’s not the place, or crucially, the time.

The cloud works wonderfully within a corporate or educational/research environment, for, say, sharing documents with colleagues in London, Leeds or even Los Angeles, as businesses and educational/research establishments will have fast, symmetric connections to the Internet, and in some instances (like where I work) actually be part of the Internet themselves.

However, back to poor old me, and my requirement to use the cloud.

It will take me ~25 hours to “seed” my cloud instance from this one machine, and this little laptop is a machine I don’t tend to have “much” data on – it only includes my Documents, nothing like my Music (that’s ~12GB), and of course it doesn’t include my 450GB of photos.

And therein lies the problem with the cloud – it’s not the cloud where it’s raining, it’s under the cloud, me, and you, at home, wanting to upload our data to the cloud.

Great when the data is already there, like Google Apps, but getting it up into the cloud, like my documents and photographs, is a giant pain in the backside due to the lack of speed on the network that has to be used to upload it.

Now, to be clear, I am not going to blame this on the cloud – the cloud itself isn’t the problem here, not by a long shot – but on broadband, and the nature of the beast, a beast that is now severely flawed.

ADSL started life when all that one really needed to do was to download data from the Internet, and since about 1993 that is all most people have ever done.

However, the cloud is changing the game, for the better in a lot of ways. Trouble is, the networks that were built to service the Internet for consumers historically are now woefully inadequate to keep up with the demand for data flow, and are essentially not fit for the new purpose of this, the “new” Internet and the new world order of “cloud” – certainly not cloud storage anyway; cloud applications are a different matter – that is all about sending data from the Internet to the clients.

One thing to look forward to is fibre connectivity.

If you can get it.

I couldn’t – well, at least not until June this year, as BT appeared to have screwed up the rollout of FTTC – otherwise known as Fibre To The Cabinet – in my area: they missed out a cabinet in mine, the newest of estates in the town I live in.

There are people in outlying areas that have this capability – but no, not me, or anyone on my estate, and it’s taken BT a very long time to undo their screwup of missing out upgrading this one cabinet.

30 June 2013 is the date I might have been able to get fibre-based broadband at home, and yes, this will massively improve my ability to upload my files to the cloud storage provider of my choice.

Even then, though, it will only be ~15Mbps – 1.8MB/s – which admittedly will still mean it takes me a long time to upload my photographs – a shade under 3 days – but under an hour to upload my documents, a distinct improvement.

Like real clouds, the Internet Cloud, it’s great when you are in it, like I am at work, but for me, at least home-wise at the moment, whilst it isn’t raining in the cloud, I definitely am under a cloud, and I feel a little cold and damp as a result.

Facepalm – sudoers not working

DOH!!!!!

Always remember to check the command you are asking to be run as sudo…

Here I was the other day, setting up Nagios NRPE agents on a new server at work, and I kept getting “NRPE Unknown” errors in the Nagios console.

Normally, this is down to the user who is running the command on the client server not having the permissions to execute the check_xxxx command, and, as these commands are potentially sensitive commands, they are run as a sudo command without a password.

On Debian, there is a lovely option to specify a secondary sudoers file under /etc/sudoers.d/sudoers, which allows an identical file to be copied to all servers, yet still lets the main sudoers file in /etc/sudoers be unique per machine.
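
For context, the two halves of that setup look roughly like this – the NRPE command definition on the client, and the matching drop-in sudoers entry (check names and plugin paths here are illustrative, not my exact ones):

# /etc/nagios/nrpe.cfg (or a file under nrpe.d/) – run the check via sudo
command[check_ntp]=/usr/bin/sudo /usr/lib/nagios/plugins/check_ntp -H pool.ntp.org

# /etc/sudoers.d/sudoers – let the nagios user run anything in the plugins directory as root, no password
nagios ALL=(root) NOPASSWD: /usr/lib/nagios/plugins/

# and always syntax-check a sudoers drop-in before relying on it
visudo -cf /etc/sudoers.d/sudoers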

Now, I had copied this /etc/sudoers.d/sudoers file from another server for which I knew Nagios nrpe services were working…. Yet inexplicably, I was getting problems in the Nagios console, “NRPE Error – unknown” – which I know is NRPE not being able to run the remote command, 99% of the time a sudo error.

The last time I saw this I had screwed up the sudoers file, such that the nagios user didn’t have the permissions to run the commands with “NOPASSWD”.

In this case, that wasn’t the problem, after all I had copied the sudoers file from a working machine … Right?

Well, yes.
So that wasn’t the problem.

Log onto the client, su – from root to the nagios user, and of course the nagios user’s shell is /bin/false – one small change later, at least for testing, the nagios user has a shell.

Try again, su – nagios, and I get a shell and a prompt, ok, good start.

Now, try /sbin/sudo /usr/lib/nagios/plugins/check_ntp, and what happens?
I get prompted to enter the nagios user’s password.

Hmmmm – last I looked I hadn’t even set one.

Exit nagios user, become root again, set (crap) password for user, su – to nagios user and try again, and even enter the (crap) password…

Same bloody error.

WTF!?!?!

Am I really going mad? (Ok don’t answer that one)
Ok, silly thought…

ls -al /usr/lib/nagios/plugins/check_ntp
/bin/bash: command not found.

Aaaaaaaaaah.

ls -al /usr/lib/nagios/plugins
.
..

Ooooooops… No files.

No files exist to run, let alone run via sudo.

Lesson to be learnt: sudo protects you from your own stupidity – if the command doesn’t exist, sudo will give you an error.

Second lesson to be learnt: make sure the commands you want to run as sudo actually exist!!!!
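
These days I do a quick sanity check before pointing the finger at sudoers (the plugin directory is whatever your distro uses):

# are the plugins actually installed?
ls -al /usr/lib/nagios/plugins/
# and what does sudo think the nagios user is allowed to run? (run as root)
sudo -l -U nagios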

Definitely a facepalm day.

Upgrading Drupal on Oracle (again)

drush vset maintenance_mode 1

Doesn’t bloody work with an Oracle install. Typical.

Well – today I have neither the time nor the inclination to investigate, so I went in via the GUI and set the site to maintenance mode.

Then I got on with the command line options.

drush dl                                   # download the latest Drupal 7 core
chown -R user:web-user drupal-7.xy
cd drupal-7.xy
rm -rf ./sites                             # drop the stock sites/ directory…
cd ../
cp -pR html/sites/ ./drupal-7.xy/          # …and copy the live sites/ across instead
chmod 000 drupal-7.xy/*.txt                # make the bundled .txt files unreadable
chmod 644 drupal-7.xy/robots.txt           # but keep robots.txt readable

Yip-de-deee.

So – step one done, downloaded and copied most of the things as required.
Now I need to patch the update.inc file as I detailed here.

cp html/includes/update.inc.patch drupal-7.xy/includes/
cd drupal-7.xy/includes/
cp update.inc update.inc.bak
patch --dry-run -i update.inc.patch
patch -i update.inc.patch
diff update.inc.bak update.inc
chown user update.inc

Phew – now done the patch.

One last thing – copy the Oracle Database driver to the new area.

cp -pR html/includes/database/oracle ./drupal-7.xy/includes/database/

Now move the new directory into place, and run the update.php

mv html/ html-old
mv drupal-7.xy/ html/

Browse to http://www.site.name/update.php

All done..

Use the GUI to turn off maintenance mode.

Clear cache from CLI

cd html/
drush cc all

Yippeeee….

Reckon I should (could) script that now…..
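
Something along these lines would probably do it – a rough sketch only, with the version, the ownership and the docroot names (user:web-user, html, drupal-7.xy) as placeholders matching the steps above:

#!/bin/bash
# rough sketch of scripting the point-release update above – adjust the placeholders before trusting it
set -e
VERSION=7.xy
NEW=drupal-${VERSION}

drush dl drupal-${VERSION}              # fetch the new core
chown -R user:web-user ${NEW}
rm -rf ${NEW}/sites
cp -pR html/sites/ ${NEW}/              # carry the live sites/ across
chmod 000 ${NEW}/*.txt
chmod 644 ${NEW}/robots.txt

# re-apply the Oracle PDO patch to update.inc
cp html/includes/update.inc.patch ${NEW}/includes/
( cd ${NEW}/includes && patch -i update.inc.patch )

# keep the Oracle database driver
cp -pR html/includes/database/oracle ${NEW}/includes/database/

# swap the docroots over
mv html html-old
mv ${NEW} html

echo "Now run update.php in the browser, then: cd html && drush cc all"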

Flush, flush – and flush again.

I am not talking toilets here – although for a short while, I wish I was – it would have been less frustrating.

I work and play in a LAMP environment – for example, this blog and site live on a LAMP stack – and I have recently needed to look at tweaking some performance handles.

A number of things have made me properly swear this week.

memcached, APC, and suexec primarily, along with the caches that exist in the software “running” the sites – like WordPress and Drupal.

Caches are vitally important for anything in computing, as the direct path for repetitive data from storage to consumption – whether by a computer or ultimately a human – is simply, even in this era of super-fast computing, just too damn slow.

We have files from disk cached in physical RAM by the Linux kernel, increasing performance by dropping the latency at which frequently accessed files are read off the disk.

We have caching on the storage controllers for much the same reason – disk storage is slow (SSD isn’t that common – and is still relatively slow when compared with the access times of CPU cache).

We have caching of snippets of “compiled” code with APC (or similar), because it is computationally more expensive – which means time-expensive – to recompile the code in an interpreter (PHP) than it is to store that compiled code in memory.

We use caches of commonly requested SQL queries, because requesting the data from the actual database every time is expensive time-wise – this is memcached.

We have the application itself (WordPress/Drupal) caching pages, or most of the pages, and some calls it makes, to save time where required.

All of these caches improve the user experience, even if that user is a computer doing automated queries.
We’ve come a long way.

The downside comes when data in one of these caches can trample over the data in another one of these caches – not from a memory perspective, because the areas of memory that each of these caches uses are very definitely separate (if they weren’t, there’d be huge opportunity for data trampling and associated corruption) – but from a re-use perspective.

A page has been requested by a user, the application has done its caching, and thus could serve that page again without having to re-execute some parts (or all) of the request.
Great …..

Until we, the systems engineers, want to tweak out more performance by adding another level of caching.

And due to the nature of the beast – the application’s cached results try to get executed at the next level in the chain – and end up having problems with, for example, “Cannot redeclare class…” errors (insert the MySQL class of your choice).

Enter the world of hair-pulling and head-table banging.

Enable APC from the PHP/Apache configuration perspective: the first page off the site works; the second, the infamous PHP white screen of death, and a nasty message in the Apache log.

All worked perfectly on a test environment, but the only difference in the test environment is that there was one application installed on the server, not two or more.

Different site, same server, works properly.
Eh ? go figure!

Now, this other site was the one that I had configured with APC/memcached first, so were we talking cache prefix keys trampling over each other?

This train of thought ended with this result from a Google search.

Bottom line, yes, the APC module configuration from the first site could easily have been stamping all over the requests from the second (and third) sites.

So – configure the

$conf['cache_prefix']

as appropriate on both sites, and restart Apache to clear the APC cache.
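
On the command line that’s nothing more exotic than appending one line to each site’s settings.php – the prefix strings just have to differ per site, and the paths below are made up (Drupal’s settings.php has no closing ?> tag, so appending is safe):

echo "\$conf['cache_prefix'] = 'siteone_';" >> /var/www/siteone/sites/default/settings.php
echo "\$conf['cache_prefix'] = 'sitetwo_';" >> /var/www/sitetwo/sites/default/settings.php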

The problem was still apparent !!!!!!

Stop/start apache to clear APC cache, first page, fine, second page, no matter what that page, CRASH.

Bugger.
Now what.

Damn – disable it all, and put to the side for the day.
Next day, start from a clean slate.

And that term, clean slate was what got it for me.

Clean, totally, clear everything.
There is a reason there are a lot of instructions in Drupal for “drush cc all” when deploying new code or functionality, or modules etc: it clears all of Drupal’s internal caches, and starts everything afresh.

So, clear all caches – Drupal, WordPress – restart memcached, restart Apache, then try accessing the website(s).
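
In practice that boils down to something like this (a sketch – the WordPress step assumes either a caching plugin you can purge from its admin page, or WP-CLI if it happens to be installed):

cd /path/to/drupal && drush cc all      # flush every Drupal cache
wp cache flush                          # or purge the WordPress cache from its dashboard
service memcached restart               # empty memcached
service apache2 restart                 # restart Apache, which also empties APC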

All perfect!!

Multiple pages, one after another, work perfectly now.

We also get some interesting statistics with the use of APC, out of the 512MB we assigned, we are using about 380MB of it at the moment, but, by God the sites are faster.

Therefore, the words of wisdom for today – like you do after visiting the toilet – pull that chain, and flush all caches in a system every time you flush one. Flush ’em all.

Drupal and Oracle….. Pain

You may or may not realise it, but my “other” site, a pure testing site, www.macgyver.yi.org runs Drupal on an Oracle database.

I’ve done it, actually, to learn Oracle, and to give me a site I don’t care (so much) about breaking when I am learning.

Well for a good couple of weeks now, I’ve been getting the “There are new releases available for MacGyver.yi.org” email as there has been a new point version released.

OK – good, I like it that I am getting the emails, fantastic, I know that I need to do the work to do the upgrade.

So, today, as part of the bank holiday relaxation, I decided to do that upgrade (an update, as Drupal defines it: an upgrade is from one major version to the next, an update is a point release within the same major version).

Great – using this link, there appear to be some nice instructions.

As far as I was concerned, all worked great until step 6.

To quote:-

Run update.php by visiting http://www.example.com/update.php (replace www.example.com with your domain name). This will update the core database tables.

If you are unable to access update.php do the following:

Open settings.php with a text editor.

Find the line that says:

$update_free_access = FALSE;

Change it into:

$update_free_access = TRUE;

Once the upgrade is done, $update_free_access must be reverted to FALSE.

And running <http://site/update.php> is where I came a cropper:-

A PDO database driver is required!

You need to enable the PDO_ORACLE database driver for PHP 5.2.4 or higher so that Drupal 7 can access the database.

See the system requirements page for more information.

Two things here:

1) Version of PHP stated in the error message is wrong (very wrong – not even major revision close)

2) PDO driver for Oracle? Eh? I already have Drupal talking to Oracle, from the installation, so what goes????

Damn, it’s not as simple as I was hoping, and now I was looking at a little bit of a fight to get this working.

Double damn – I had some more plans for today, other than relaxing 😀

Google has some answers, in the form of this link.

I’m going to copy the instructions here – at least then I have a copy of them.

Please note: the patch you will do uses “11.1” as the version number. It will work with version 11.2 and later (but unless you update the patch, you should continue using “11.1” in the ./configure command).

## Download the following instantclient files from Oracle’s website

http://www.oracle.com/technetwork/topics/linuxx86-64soft-092277.html

NOTE – You will need to have an Oracle account, and accept the terms and conditions.

Once downloaded, and copied to your webserver, unzip them.

unzip instantclient-basic-linux-x86-64-11.2.0.2.0.zip
unzip instantclient-sdk-linux-x86-64-11.2.0.2.0.zip

## Move the files to our install location, /usr/lib/oracle/instantclient

mkdir /usr/lib/oracle
mv instantclient_11_2/ /usr/lib/oracle/instantclient

## Fix some poorly named files and add them to our system’s library index:

cd /usr/lib/oracle/instantclient
ln -s libclntsh.so.* libclntsh.so
ln -s libocci.so.* libocci.so
echo /usr/lib/oracle/instantclient >> /etc/ld.so.conf
ldconfig

## Fix more stupid paths:

mkdir -p include/oracle/11.1/
cd include/oracle/11.1/
ln -s ../../../sdk/include client
cd /usr/lib/oracle/instantclient
mkdir -p lib/oracle/11.1/client
cd lib/oracle/11.1/client
ln -s ../../../../ lib

## Download PDO_OCI

mkdir -p /tmp/pear/download/
cd /tmp/pear/download/
pecl download pdo_oci
tar -xvzf PDO_OCI*.tgz
cd PDO_OCI*

## Patch PDO_OCI since it hasn’t been updated since 2005

# copy the lines below into the file “config.m4.patch”

*** config.m4 2005-09-24 17:23:24.000000000 -0600
--- config.m4 2009-07-07 17:32:14.000000000 -0600
***************
*** 7,12 ****
--- 7,14 ----
if test -s "$PDO_OCI_DIR/orainst/unix.rgs"; then
PDO_OCI_VERSION=`grep '"ocommon"' $PDO_OCI_DIR/orainst/unix.rgs | sed 's/[ ][ ]*/:/g' | cut -d: -f 6 | cut -c 2-4`
test -z "$PDO_OCI_VERSION" && PDO_OCI_VERSION=7.3
+ elif test -f $PDO_OCI_DIR/lib/libclntsh.$SHLIB_SUFFIX_NAME.11.1; then
+ PDO_OCI_VERSION=11.1
elif test -f $PDO_OCI_DIR/lib/libclntsh.$SHLIB_SUFFIX_NAME.10.1; then
PDO_OCI_VERSION=10.1
elif test -f $PDO_OCI_DIR/lib/libclntsh.$SHLIB_SUFFIX_NAME.9.0; then
***************
*** 119,124 ****
--- 121,129 ----
10.2)
PHP_ADD_LIBRARY(clntsh, 1, PDO_OCI_SHARED_LIBADD)
;;
+ 11.1)
+ PHP_ADD_LIBRARY(clntsh, 1, PDO_OCI_SHARED_LIBADD)
+ ;;
*)
AC_MSG_ERROR(Unsupported Oracle version! $PDO_OCI_VERSION)
;;

## Attempt to compile (this is where you’re probably stuck, make sure you’re in your PDO_OCI folder!)

export ORACLE_HOME=/usr/lib/oracle/instantclient
patch --dry-run -i config.m4.patch
patch -i config.m4.patch
phpize
./configure --with-pdo-oci=instantclient,/usr/lib/oracle/instantclient,11.1

##

If you get an error as follows…

checking for PDO includes… checking for PDO includes…
configure: error: Cannot find php_pdo_driver.h.

Then you may get this fixed by doing…

ln -s /usr/include/php5 /usr/include/php

And you can continue by retrying the configure.

make
make test
make install

## Add extensions to PHP

# Create /etc/php5/apache2/conf.d/pdo_oci.ini

echo "extension=pdo_oci.so" >> /etc/php5/apache2/conf.d/pdo_oci.ini

## restart Apache

/etc/init.d/apache2 restart

Congratulations you made it!

## install Drupal!
Read the INSTALL file in the Drupal oracle module. It must be put in a special place in Drupal’s filesystem!

Now, although this all worked for me, in the sense that a phpinfo page returned “Enabled” for PDO_OCI, it crucially still failed on Drupal’s update.php database step.

Arggggggggggggggggggghhhhhhhhhh 😡

Thankfully, Google to the rescue again.

http://drupal.org/node/1029080

cd includes/
ls -al
cp update.inc update.inc.bak

So, pulling down the patch (copy/pasting from http://drupal.org/files/1029080-update-database-pdo-rev3_2.patch )

Put this into a file, update.inc.patch

vi update.inc.patch

Or pulling down the file directly to the patch file.

wget -c http://drupal.org/files/1029080-update-database-pdo-rev3_2.patch -O ./update.inc.patch

Now – here I do something different: I edited the file, because the file locations were different in my instance – i.e. my file wasn’t in a/includes, and I was running the patch directly from <drupal_install/includes/>

From:

diff --git a/includes/update.inc b/includes/update.inc
index f7c7b66..83fa6e4 100644
--- a/includes/update.inc
+++ b/includes/update.inc

To:

--- update.inc
+++ update.inc

Now to do the patch

patch --dry-run -i update.inc.patch
patch -i update.inc.patch
ls -altr
diff update.inc.bak update.inc

The output of the “diff update.inc.bak update.inc” should match the “update.inc.patch” file.

If it does, everything has gone to plan.

Now re-run <http://site/update.php>; et voila!!! All working.

Wow, what a pain on a Bank Holiday.

I have seen the Light(room)

Well, I blame my friend Cath….
And I am sticking to that.

It is, as always with Cath, in a good way.

The other week, a group of us went to RIAT on the Sunday this year for a good day out, and the opportunity to do some good shooting of fast jets (and not-so-fast prop planes – but with the BBMF, awesome planes).

Well, about 24GB of photos later (for me anyway), some of us were sat around munching pizza and reviewing our great (and sometimes not so great, again, more referring to me here) shots.

I fire up mine in Linux using Gnome Photo Viewer, Cath fires up Lightroom, and detail freak fires up Windows 7.

Well, that’s a pretty full complement of alternatives between us.
That, in its own right, is hilarious, and worthy of a celebration that there is that much choice about, as we have all, in our own way, taken a different path.

However, the one thing that did stand out, at least to my mind, was the power of Lightroom.
Who used that out of the three of us ? Yup, the pro photographer, and yup, of course, on a MacBook.

Opened my eyes I can tell you.

I am used to copying all my files off my CF cards onto an external hard drive.

I then run a script to go through all the files, do a little rename on them, create a new directory, and move all the RAW files into that new directory.

As I am paranoid, I then back them up to another hard drive, and then sync one of the two external drives to a NAS.

I then go through all the files using Gnome Viewer, and each time I get to a photo I class as “best” – or at least worthy of using – I move it to a directory called “Best”.

If those photographs are feeling a little “missing”, then I will fire up UFRaw and do a “fix”, or enhancement, with that, and maybe use GIMP to do a crop…

I then re-sync all 3 devices, 2 ext drives, and the NAS, so I’ve not lost anything.
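
For the record, that old workflow boils down to a script not much bigger than this sketch (all the paths, the naming scheme and the .CR2 extension are made up here – mine vary per shoot):

#!/bin/bash
# rough sketch of the old copy/rename/backup workflow described above
set -e
CARD=/media/CF_CARD/DCIM
SHOOT=$1                                  # e.g. 2013-07-21_RIAT
DEST=/mnt/ext1/photos/${SHOOT}

mkdir -p "${DEST}/RAW"
cp -p "${CARD}"/*/*.CR2 "${DEST}/"        # pull the RAW files off the card

# simple rename: prefix every file with the shoot name, then tuck them into RAW/
for f in "${DEST}"/*.CR2; do
    mv "$f" "${DEST}/${SHOOT}_$(basename "$f")"
done
mv "${DEST}"/*.CR2 "${DEST}/RAW/"

# paranoid copies: second external drive, then sync one drive to the NAS
rsync -a "${DEST}/" "/mnt/ext2/photos/${SHOOT}/"
rsync -a /mnt/ext1/photos/ nas:/volume1/photos/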

Cath showed me, through the gift of Lightroom, a whole new, more efficient way of working: just use Lightroom to import the RAW photos, rate how good they are, and then process them, including tweaking/correcting them for body and/or lens.

… Oh – and tag them with the date, context, place etc….

And sod the JPGs from the camera: just take RAW, and process them in Lightroom to JPG if required for my web gallery (and even print them from Lightroom).

So – guess what happened the following week at work – when my work PC decided to tell me my Adobe Flash was out of date????

It showed me a link to a trial download of Lightroom – so, once home, I fired up the link again, and dropped myself a copy onto my MacBook.

I’m hooked.
Completely.
… and utterly.

I put it to the test on the photographs I took at RIAT.
Out of the 1800 or so that I took, I rated them, and ended up with 255 shots that I then processed.

Lightroom also then processed them, correcting for my lens (my Sigma 120-300mm f/2.8), then turned them into a web gallery, using Flash to create a slideshow, which I uploaded here.

I’ve called it “OLD” as I don’t expect it to be around long, as whilst it was a good test of what Lightroom can do, it doesn’t fit in with the gallery software I use to showcase my photographs, and to be honest, much as that software has been a pain recently, I don’t think I will be replacing it any time soon.

One of the things I will say, is that there is a considerable difference between the JPGs produced by the camera and the JPGs produced by Lightroom.

IMHO, and it is very humble, I am really preferring the output from Lightroom, even with almost no “tweaking”.

Again – let me say that again – I prefer the output from Lightroom.

Is this me looking at the JPGs produced from the camera on one machine against ones on another machine – NO.

Same machine.

And – yeah, I have calibrated the screen.
And yeah, in the same program – in the case of viewing the output, this was using either Safari or Chrome on my MacBook.

The JPGs produced by the camera are too “watery”, it’s the best description of the output I can give.

They are lacking a certain something.

It’s a frustrating position.

It means that on each and every shoot I need to process the shots through Lightroom (or equivalent) to get a JPG out for display in a gallery.

It’s a different way of working I guess, but, as I alluded to earlier, I think more efficient.

Well, me being an Open Source software advocate, is there something I can put onto my Linux machine that does the same thing as Lightroom ?
Would it give me the same workflow ?

I had a look around, and found a GPL’d piece of software called Darktable.
I’ve added the Ubuntu repositories to my 10.04 LTS box (which, yeah, is no longer supported, but I need a bigger HDD to do a reinstall), installed it, and will report back on its functionality as and when.

Whilst I was looking at Darktable tonight, I looked at the status of GIMP and 16-bit editing, something that I personally haven’t worried about, until now.

At the moment, it works, but only apparently in the development versions, nothing stable.
OK – for now, putting off using it for proper manipulation.

There is one other thing that is rather interesting with all my investigation and trial work.

At *NO* point did I even contemplate going down the Windows route and whatever tools I could get on Windows – Lightroom/Photoshop included.

It simply didn’t even enter into my mind as an option, not until I had almost finished writing this entry.

Is that because I haven’t used Windows as a home tool in so long now that it is out of my conscious thought ?

Is it because I don’t know Windows well enough these days to make an informed decision ?

Actually, I don’t know, probably the former, even though I have to use Windows at work, its use as a tool for me, is just irrelevant.

With whatever the outcome of my experiments with Darktable, and the wait for GIMP to do 16/32-bit editing, I’ve decided it’s high time to finally bite the bullet, and go Lightroom and Photoshop on my MacBook.

I just don’t think that as a photographer, amateur as I am, I can justify *NOT* going down this route now, not after seeing the power and simplicity that it can give me.

Even though any photo reviewing and processing is going to take some time, like it did in the days of film, anything that can cut it down and gain me extra time in the “field” shooting has to be a blessing.

For the moment, for me, that is Lightroom at a minimum.

Doing it this way does give me a minor headache now though….

OSX doesn’t read my Ext3/JFS formatted disks….

That, however, is a problem I will look at another day.

Thanks Cath, I have a different photographic related problem…. x 🙂
(One I prefer to be honest)