Flush, flush – and flush again.

I am not talking toilets here – although for a short while, I wished I was – it would have been less frustrating.

I work and play in a LAMP environment – this blog and site, for example, live on a LAMP stack – and I have recently needed to look at tweaking some performance handles.

A number of things have made me properly swear this week.

memcached, APC, and suexec primarily, along with the caches that exist in the software “running” the sites – like WordPress and Drupal.

Caches are vitally important for anything in computing, because the direct path for repetitive data from storage to its consumer – whether a computer or, ultimately, a human – is simply, even in this era of super-fast computing, just too damn slow.

We have files from disk cached in physical RAM by the Linux kernel, increasing performance by dropping the latency at which frequently accessed files are actually read off the disk.

We have caching on the storage controllers for much the same reason – disk storage is slow (SSD isn’t that common – and is still relatively slow when compared with the access times of CPU cache).

We have caching of snippets of “compiled” code with APC (or similar), because it is computationally more expensive – which means time expensive – to recompile the code for an interpreter (PHP) than it is to store that compiled code in memory.

We use caches of commonly requested SQL query results because requesting the data from the actual database every time is expensive, time-wise – this is memcached.

We have the application itself (WordPress/Drupal) caching pages, or most of the pages, and some of the calls it makes, to save time where it can.

All of these caches improve the user experience, even if that user is a computer doing automated queries.
We’ve come a long way.

The downside comes when data in one of these caches can trample over the data in another one of these caches. Not from a memory perspective – the areas of memory that each of these caches uses are very definitely separate (if they weren’t, there’d be huge opportunity for data trampling and associated corruption) – but from a re-use perspective.

A page has been requested by a user, the application has done its caching, and thus could serve that page again without having to re-execute some parts (or all) of the request.
Great …..

Until we, the systems engineers, want to tweak out more performance by adding another level of caching.

And due to the nature of the beast, the application’s cached results try to get executed at the next level in the chain – and end up having problems with, for example, “Cannot redeclare class <insert MySQL class here>”.

Enter the world of hair-pulling and head-table banging.

Enable APC from the PHP/Apache configuration side, and the first page off the site works; the second gives the infamous PHP white screen of death, and a nasty message in the Apache log.

All worked perfectly on a test environment – but the one difference in the test environment was that there was only one application installed on the server, not two or more.

Different site, same server, works properly.
Eh? Go figure!

Now, this other site was the one that I had configured with APC/memcached first, so were we talking cache prefix keys trampling over each other?

That train of thought ended with this from a Google search.

Bottom line, yes, the APC module configuration from the first site could easily have been stamping all over the requests from the second (and third) sites.

So – configure the

$conf['cache_prefix']

as appropriate on both sites, and restart apache to clear the APC cache.
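To make that concrete, here is a minimal sketch of what I mean – the document roots and prefix values are entirely hypothetical (and written under /tmp so the snippet is harmless to run):

```shell
# Hypothetical Drupal document roots -- substitute your own.
SITE_A=/tmp/site_a; SITE_B=/tmp/site_b
mkdir -p "$SITE_A" "$SITE_B"

# Give each site its own cache key prefix, so one site's entries
# in APC/memcached can never be served up in answer to another's.
cat >> "$SITE_A/settings.php" <<'EOF'
$conf['cache_prefix'] = 'site_a_';
EOF
cat >> "$SITE_B/settings.php" <<'EOF'
$conf['cache_prefix'] = 'site_b_';
EOF
```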

The problem was still apparent !!!!!!

Stop/start apache to clear APC cache, first page, fine, second page, no matter what that page, CRASH.

Bugger.
Now what.

Damn – disable it all, and put it to the side for the day.
Next day, start from a clean slate.

And that term, clean slate was what got it for me.

Clean, totally, clear everything.
There is a reason there are a lot of instructions in Drupal for “drush cc all” when deploying new code, functionality, modules etc: it clears all of Drupal’s internal caches and starts everything afresh.

So, clear all caches – Drupal, WordPress – restart memcached, restart Apache, and then try accessing the website(s).
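For my own notes, that full flush boils down to something like the function below. It is a sketch, not gospel: the drush root, the init scripts and the Debian-style paths are all assumptions, and WordPress needs whatever flush its particular caching plugin provides.

```shell
# Flush every layer of the cache stack in one go.
flush_all() {
    # Application level first: clear all of Drupal's internal caches.
    drush -r /var/www/drupal cc all
    # (WordPress: flush via whichever caching plugin is installed.)

    # Then the shared daemons: restarting memcached drops every key...
    /etc/init.d/memcached restart
    # ...and restarting Apache throws away APC's shared memory segment.
    /etc/init.d/apache2 restart
}
```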

All perfect!!

Multiple pages, one after another, work perfectly now.

We also get some interesting statistics with the use of APC: out of the 512MB we assigned, we are using about 380MB of it at the moment – but, by God, the sites are faster.
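For reference, that 512MB allocation lives in APC’s ini settings. A sketch of the fragment follows – the conf.d path in the comment is the Debian/Ubuntu convention, and the example writes to /tmp so it is harmless to run:

```shell
# Write an example APC configuration fragment; on a real box this
# would go in /etc/php5/apache2/conf.d/apc.ini, followed by an
# Apache restart to pick it up.
cat > /tmp/apc.ini <<'EOF'
extension=apc.so
apc.enabled=1
; older APC releases want a bare number of megabytes here;
; newer ones also accept the M suffix
apc.shm_size=512M
EOF
```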

Therefore, the words of wisdom for today – like you do after visiting the toilet – pull that chain, and flush all caches in a system every time you flush one. Flush ’em all.

Drupal and Oracle….. Pain

You may or may not realise it, but my “other” site, a pure testing site, www.macgyver.yi.org, runs Drupal on an Oracle database.

I’ve done it to learn Oracle, and to give myself a site I don’t care (so much) about breaking while I am learning.

Well for a good couple of weeks now, I’ve been getting the “There are new releases available for MacGyver.yi.org” email as there has been a new point version released.

OK – good, I like it that I am getting the emails, fantastic, I know that I need to do the work to do the upgrade.

So, today, as part of the bank holiday relaxation, I decided to do that upgrade (an update, as Drupal defines it: an upgrade is from one major version to the next, an update is a point release within the same major version).

Great – using this link, there appear to be some nice instructions.

As far as I was concerned, all worked great until step 6.

To quote:-

Run update.php by visiting http://www.example.com/update.php (replace www.example.com with your domain name). This will update the core database tables.

If you are unable to access update.php do the following:

Open settings.php with a text editor.

Find the line that says:

$update_free_access = FALSE;

Change it into:

$update_free_access = TRUE;

Once the upgrade is done, $update_free_access must be reverted to FALSE.

And running <http://site/update.php> is where I came a cropper:-

A PDO database driver is required!

You need to enable the PDO_ORACLE database driver for PHP 5.2.4 or higher so that Drupal 7 can access the database.

See the system requirements page for more information.

Two things here:

1) Version of PHP stated in the error message is wrong (very wrong – not even major revision close)

2) PDO driver for Oracle ? Eh ? Already have Drupal talking to Oracle, from the installation, so what goes ????

Damn, it’s not as simple as I was hoping, and now I was looking at a little bit of a fight to get this working.

Double damn – I had some more plans for today, other than relaxing 😀

Google has some answers, in the form of this link.

I’m going to copy the instructions here – at least then I have a copy of them.

Please note: the patch you will do uses “11.1” as the version number. It will work with version 11.2 and later (but unless you update the patch, you should continue using “11.1” in the ./configure command).

## Download the following instantclient files from Oracle’s website

http://www.oracle.com/technetwork/topics/linuxx86-64soft-092277.html

NOTE – You will need to have an Oracle account, and accept the terms and conditions.

Once downloaded, and copied to your webserver, unzip them.

unzip instantclient-basic-linux-x86-64-11.2.0.2.0.zip
unzip instantclient-sdk-linux-x86-64-11.2.0.2.0.zip

## Move the files to our install location, /usr/lib/oracle/instantclient

mkdir /usr/lib/oracle
mv instantclient_11_2/ /usr/lib/oracle/instantclient

## Fix some poorly named files and add them to our system’s library index:

cd /usr/lib/oracle/instantclient
ln -s libclntsh.so.* libclntsh.so
ln -s libocci.so.* libocci.so
echo /usr/lib/oracle/instantclient >> /etc/ld.so.conf
ldconfig

## Fix more stupid paths:

mkdir -p include/oracle/11.1/
cd include/oracle/11.1/
ln -s ../../../sdk/include client
cd /usr/lib/oracle/instantclient
mkdir -p lib/oracle/11.1/client
cd lib/oracle/11.1/client
ln -s ../../../../ lib

## Download PDO_OCI

mkdir -p /tmp/pear/download/
cd /tmp/pear/download/
pecl download pdo_oci
tar -xvzf PDO_OCI*.tgz
cd PDO_OCI*

## Patch PDO_OCI since it hasn’t been updated since 2005

# copy the lines below into the file “config.m4.patch”

*** config.m4 2005-09-24 17:23:24.000000000 -0600
--- config.m4 2009-07-07 17:32:14.000000000 -0600
***************
*** 7,12 ****
--- 7,14 ----
if test -s "$PDO_OCI_DIR/orainst/unix.rgs"; then
PDO_OCI_VERSION=`grep '"ocommon"' $PDO_OCI_DIR/orainst/unix.rgs | sed 's/[ ][ ]*/:/g' | cut -d: -f 6 | cut -c 2-4`
test -z "$PDO_OCI_VERSION" && PDO_OCI_VERSION=7.3
+ elif test -f $PDO_OCI_DIR/lib/libclntsh.$SHLIB_SUFFIX_NAME.11.1; then
+ PDO_OCI_VERSION=11.1
elif test -f $PDO_OCI_DIR/lib/libclntsh.$SHLIB_SUFFIX_NAME.10.1; then
PDO_OCI_VERSION=10.1
elif test -f $PDO_OCI_DIR/lib/libclntsh.$SHLIB_SUFFIX_NAME.9.0; then
***************
*** 119,124 ****
--- 121,129 ----
10.2)
PHP_ADD_LIBRARY(clntsh, 1, PDO_OCI_SHARED_LIBADD)
;;
+ 11.1)
+ PHP_ADD_LIBRARY(clntsh, 1, PDO_OCI_SHARED_LIBADD)
+ ;;
*)
AC_MSG_ERROR(Unsupported Oracle version! $PDO_OCI_VERSION)
;;

## Attempt to compile (this is where you’re probably stuck, make sure you’re in your PDO_OCI folder!)

export ORACLE_HOME=/usr/lib/oracle/instantclient
patch --dry-run -i config.m4.patch
patch -i config.m4.patch
phpize
./configure --with-pdo-oci=instantclient,/usr/lib/oracle/instantclient,11.1

##

If you get an error as follows…

checking for PDO includes… checking for PDO includes…
configure: error: Cannot find php_pdo_driver.h.

Then you may get this fixed by doing…

ln -s /usr/include/php5 /usr/include/php

And you can continue by retrying the configure.

make
make test
make install

## Add extensions to PHP

# Create /etc/php5/apache2/conf.d/pdo_oci.ini

echo "extension=pdo_oci.so" >> /etc/php5/apache2/conf.d/pdo_oci.ini

## restart Apache

/etc/init.d/apache2 restart
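A quick sanity check I like to do at this point (my own habit, not part of the original instructions) is to ask the PHP CLI whether the module actually loaded:

```shell
# If "php -m" lists pdo_oci, the extension compiled and loaded;
# Apache's mod_php reads the same conf.d directory, so it should
# agree (phpinfo() confirms it from the browser side).
check_pdo_oci() {
    php -m | grep -qi pdo_oci && echo "PDO_OCI loaded"
}
```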

Congratulations you made it!

## install Drupal!
Read the INSTALL file in the Drupal oracle module. It must be put in a special place in Drupal’s filesystem!

Now, this all worked for me in the sense that a phpinfo page returned an “Enabled” for PDO_OCI – but, crucially, it still failed on the database “update.php” step from Drupal.

Arggggggggggggggggggghhhhhhhhhh 😡

Thankfully, Google to the rescue again.

http://drupal.org/node/1029080

cd includes/
ls -al
cp update.inc update.inc.bak

So, pulling down the patch (copy/pasting from http://drupal.org/files/1029080-update-database-pdo-rev3_2.patch )

Put this into a file, update.inc.patch

vi update.inc.patch

Or pulling down the file directly to the patch file.

wget -c http://drupal.org/files/1029080-update-database-pdo-rev3_2.patch -O ./update.inc.patch

Now – here I do something different: I edited the file, because the file locations were different in my instance – i.e. my file wasn’t in a/includes, and I was running the patch directly from within <drupal_install>/includes/.

From:

diff --git a/includes/update.inc b/includes/update.inc
index f7c7b66..83fa6e4 100644
--- a/includes/update.inc
+++ b/includes/update.inc

To:

--- update.inc
+++ update.inc
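As an alternative to editing the headers like that, patch can strip the leading a/ and b/ components itself with -p1, run from the Drupal root. A tiny self-contained demonstration under /tmp (the file contents are made up):

```shell
# Recreate the situation in miniature: a git-style patch whose
# paths carry the a/ and b/ prefixes.
mkdir -p /tmp/p1demo/includes && cd /tmp/p1demo
printf 'old line\n' > includes/update.inc
cat > update.inc.patch <<'EOF'
--- a/includes/update.inc
+++ b/includes/update.inc
@@ -1 +1 @@
-old line
+new line
EOF
# -p1 strips one leading path component, so the patch lands on
# includes/update.inc relative to the current directory.
patch -p1 -i update.inc.patch
```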

Now to do the patch

patch --dry-run -i update.inc.patch
patch -i update.inc.patch
ls -altr
diff -u update.inc.bak update.inc

The output of “diff -u update.inc.bak update.inc” should match the changes in the “update.inc.patch” file.

If it does, everything has gone to plan.

Now re-run <http://site/update.php>; et voila!!! All working.

Wow, what a pain on a Bank Holiday.

I have seen the Light(room)

Well, I blame my friend Cath….
And I am sticking to that.

It is, as always with Cath, in a good way.

The other week, a group of us went to RIAT on the Sunday this year for a good day out, and the opportunity to have some good shooting of fast jets (and not-so fast prop planes, but with the BBMF – awesome planes)

Well, about 24GB of photos later (for me anyway), some of us were sat around munching pizza and reviewing our great (and sometimes not so great, again, more referring to me here) shots.

I fire up mine in Linux using Gnome Photo Viewer, Cath fires up Lightroom, and detail freak fires up Windows 7.

Well, that’s a pretty full complement of alternatives between us.
That, in its own right, is hilarious, and worthy of a celebration that there is that much choice about, as we have all, in our own way, taken a different path.

However, the one thing that did stand out, at least to my mind, was the power of Lightroom.
Who used that out of the three of us ? Yup, the pro photographer, and yup, of course, on a MacBook.

Opened my eyes I can tell you.

I am used to copying all my files off my CF cards onto an external hard drive.

I then write a script to go through all the files, do a little rename on them, create a new directory, and move all the RAW files into that new directory.

As I am paranoid, I then back them up to another hard drive, and then sync one of the two external drives to a NAS.

I then go through all the files using Gnome Viewer, and each time I get to a photo I class as “best” – or at least worthy of using – I move it to a directory called “Best”.

If those photographs are feeling a little, “missing”, then I will fire up UFRAW, and do a “fix” or, enhancement with that, and maybe use GIMP to do a crop…

I then re-sync all 3 devices, 2 ext drives, and the NAS, so I’ve not lost anything.
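For what it’s worth, the script side of that routine amounts to something like this – a sketch with made-up /tmp paths standing in for the CF card, the external drives and the NAS:

```shell
# Stand-ins for the real mount points.
CARD=/tmp/cf_card; DRIVE1=/tmp/drive1; DRIVE2=/tmp/drive2
mkdir -p "$CARD" "$DRIVE1" "$DRIVE2"
touch "$CARD/IMG_0001.CR2" "$CARD/IMG_0002.CR2"   # fake RAW files

# New directory for the shoot; RAWs renamed with a date prefix.
SHOOT="$DRIVE1/$(date +%Y%m%d)_shoot"
mkdir -p "$SHOOT"
for f in "$CARD"/*.CR2; do
    mv "$f" "$SHOOT/$(date +%Y%m%d)_$(basename "$f")"
done

# Paranoia step: mirror drive one onto drive two (in reality,
# rsync -a to the second drive and then on to the NAS).
cp -a "$DRIVE1"/. "$DRIVE2"/
```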

Cath showed me, through the gift of Lightroom, a whole new, more efficient way of working: just use Lightroom to import the RAW photos, rate how good they are, and then process them, including tweaking/correcting them for body and/or lenses.

… Oh – and tag them with the date, context, place etc….

And sod the JPGs from the camera – just take RAW, and process them in Lightroom to JPG if required for my web gallery (and even print them from Lightroom).

So – guess what happened the following week at work, when my work PC decided to tell me my Adobe Flash was out of date: Adobe showed me a link to a trial download of Lightroom.

So, once home, I fired up the link again, and dropped myself a copy onto my MacBook.

I’m hooked.
Completely.
… and utterly.

I put it to the test on the photographs I took at RIAT.
Out of the 1800 or so that I took, I rated them, and ended up with 255 shots that I then processed.

Lightroom also then processed them, correcting for my lens (my Sigma 120-300mm f/2.8), and built them into a web gallery, using Flash to create a slideshow, which I uploaded here.

I’ve called it “OLD” as I don’t expect it to be around long. Whilst it was a good test of what Lightroom can do, it doesn’t fit in with the gallery software I use to showcase my photographs – and to be honest, much as that software has been a pain recently, I don’t think I will be replacing it any time soon.

One of the things I will say, is that there is a considerable difference between the JPGs produced by the camera and the JPGs produced by Lightroom.

IMHO, and it is very humble, I really prefer the output from Lightroom, even with almost no “tweaking”.

Again – let me say that again – I prefer the output from Lightroom.

Is this me looking at the JPGs produced from the camera on one machine against ones on another machine – NO.

Same machine.

And – yeah, I have calibrated the screen.
And yeah, in the same program – in the case of viewing the output, this was using either Safari or Chrome on my MacBook.

The JPGs produced by the camera are too “watery”, it’s the best description of the output I can give.

They are lacking a certain something.

It’s a frustrating position.

It means that on each and every shoot I need to process the shots through Lightroom (or equivalent) to get a JPG out for display in a gallery.

It’s a different way of working I guess, but, as I alluded to earlier, I think more efficient.

Well, me being an Open Source software advocate, is there something I can put onto my Linux machine that does the same thing as Lightroom ?
Would it give me the same workflow ?

I had a look around, and found a GPL’d piece of software called Darktable.
I’ve added the Ubuntu repositories to my 10.04 LTS box (which, yeah, is no longer supported, but I need a bigger HDD to do a reinstall), installed it, and will report back on its functionality as and when.

Whilst I was looking at Darktable tonight, I looked at the status of GIMP and 16-bit editing, something that I personally haven’t worried about, until now.

At the moment it works, but apparently only in the development versions, nothing stable.
OK – for now, putting off using it for proper manipulation.

There is one other thing that is rather interesting with all my investigation and trial work.

At *NO* point did I even contemplate going down the Windows route and whatever tools I could get on Windows – Lightroom/Photoshop included.

It simply didn’t even enter into my mind as an option, not until I had almost finished writing this entry.

Is that because I haven’t used Windows as a home tool in so long now that it is out of my conscious thought ?

Is it because I don’t know Windows well enough these days to make an informed decision ?

Actually, I don’t know – probably the former. Even though I have to use Windows at work, its use as a tool for me is just irrelevant.

Whatever the outcome of my experiments with Darktable, and the wait for GIMP to do 16/32-bit, I’ve decided it’s high time to finally bite the bullet, and go Lightroom and Photoshop on my MacBook.

I just don’t think that as a photographer, amateur as I am, I can justify *NOT* going down this route now, not after seeing the power and simplicity that it can give me.

Photo reviewing and processing is always going to take some time, like it did in the days of film, but anything that can cut it down, and gain me extra time in the “field” shooting, has to be a blessing.

For the moment, for me, that is Lightroom at a minimum.

Doing it this way does give me a minor headache now though….

OSX doesn’t read my Ext3/JFS formatted disks….

That, however, is a problem I will look at another day.

Thanks Cath, I have a different photographic related problem…. x 🙂
(One I prefer to be honest)