Still not sure exactly why, but I have been considering a tattoo for a few months. However, I don't get the appeal of traditional tattoo artwork: gothic imagery, skulls, names, etc. etc.
So the hunt was on for something different, original and most of all very me.
I thought about various electronics references: Ohm's law, resonant frequency, Q formulas; but they are a bit too obscure for mainstream people.
So along came barcodes. Plenty of forums later, I started to look at standard Code 39 barcodes, but to get something meaningful in, and have a chance of it scanning, it would need to be VERY long.
So 2D barcodes had to be considered. QR Code seems to be the current 2D format that most people recognise, and is supported by plenty of mobile phone scanners.
Still has to be quite large to scan, but it just had to be done.
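A rough back-of-the-envelope sketch (my numbers, not from any spec sheet) of why a Code 39 tattoo gets so long, assuming the common 3:1 wide-to-narrow bar ratio and a 0.25 mm narrow bar:

```python
# Rough width estimate for a Code 39 barcode.
# Each character is 9 elements (5 bars, 4 spaces), 3 of them wide,
# plus a 1-unit inter-character gap; '*' start/stop characters are added.
def code39_width_mm(text, narrow_mm=0.25, ratio=3):
    chars = len(text) + 2                  # start + stop '*'
    units_per_char = 6 + 3 * ratio + 1     # 6 narrow + 3 wide + gap
    return chars * units_per_char * narrow_mm

print(code39_width_mm("TATTOO.EXAMPLE.COM"))   # 80.0 (mm, for 18 characters)
```

By contrast, a version-1 QR code is a 21x21-module square and holds around 25 alphanumeric characters at the lowest error-correction level, which is why the 2D route wins.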
After 2.5 hours sat in a Gloucester tattoo studio, I now have a QR Code Tattoo, best of all it scans too!
Though I am not sure if Greg will ever do another QR Code.
Testing geotag feature in android wordpress app
About a month ago I was in an infrastructure meeting discussing backup strategy. The conversation was around disk-based backups such as NearStore and virtual tape libraries.
So I asked the question: "Do we know how these disk-based backup systems cope with earthquakes?"
As yesterday's data could be on the same disk that is being accessed today, a disk-level corruption could be "interesting".
At the time I got laughed at: "Earthquakes, yeah right!"
Does not seem such a silly question now, does it!
What makes a good technical Curriculum Vitae?
The Internet is flooded with differing opinions.
The only consistent advice seems to be lots of white space to aid reading, and the following sections in order:
- Personal details
- Career Objective/Summary
- Personal skills
- Summary of skills (the technical page)
- Work history (listing the most recent position first)
Outside of that: 2 pages, 4 pages, or 10 (if you have enough relevant content)?
So let's say you have 4; will one of the "2 pages max" HR / pimp people chuck it straight in the shredder? Or you don't have time to tweak the CV for a particular post, and they are looking for a skill you left out to make it fit 2 pages. Catch-22.
Suggestions on a postcard!
My son got a copy of the Nintendo DS Transformers game for Xmas, and decided to try out the WiFi features over the weekend.
I had heard that some WiFi routers and access points do not work well, as the DS wireless is not fully WiFi compliant. They aren't kidding; more like not RFC compliant in many areas:
- NO support for non-WEP security
- A TCP/IP stack that does not confirm it has taken the DHCP address it was offered, and then does not respond to ping. Instead it just uses the address and says nothing, so the next new device connects to the network, gets the same address, and the DS gets kicked off.
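A toy simulation of that failure mode (all names and behaviour invented for illustration, not the DS's actual stack): a client that silently uses its lease and ignores the server's liveness probe ends up sharing an address with the next device.

```python
# Toy DHCP pool: before re-offering a leased address, the server probes
# the current holder (a stand-in for a ping/ARP check). A client that
# never answers looks like a dead lease, so its address gets handed out again.
class Client:
    def __init__(self, name, answers_ping):
        self.name = name
        self.answers_ping = answers_ping

class DhcpPool:
    def __init__(self, addresses):
        self.addresses = addresses
        self.leases = {}                     # address -> Client

    def allocate(self, client):
        for addr in self.addresses:
            holder = self.leases.get(addr)
            if holder is None or not holder.answers_ping:
                self.leases[addr] = client   # dead-looking lease is reused
                return addr
        raise RuntimeError("pool exhausted")

pool = DhcpPool(["192.168.0.100", "192.168.0.101"])
ds = Client("Nintendo DS", answers_ping=False)   # silent client
laptop = Client("laptop", answers_ping=True)

a1 = pool.allocate(ds)
a2 = pool.allocate(laptop)
print(a1, a2)   # both get 192.168.0.100: address conflict
```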
Over the Xmas break I thought I would review my data backup process. I have a few friends with kit at their houses linked to mine via long-range WiFi, so file replication between servers seemed like a good plan!
Got some big disks attached to each server using USB 2, and set up replication. Sorted.
NAH, nothing is that easy.
NTFRS has got to be the most quirky thing ever to set up. My first word of advice: go for a pint after you configure it, and BEFORE you put any data in the replicated folder. No, in fact, that is No. 2. No. 1 has to be: test your infrastructure HARD first.
My first snag was a delayed write error on my USB 2 disk, which corrupted the USN data for the replicated folder and completely screwed the whole replication process. After some soul searching, crying, fretting and LOTS of cursing, I reset the entire NTFRS config for that server.
Sorted, I thought. No such LUCK.
Another delayed write error. Closer inspection revealed I had been a cheapskate when I added USB 2 to my ageing server: it had a "VIA" chip. DOH! DOH! DOH! DOH! DOH!
Quick trip to a local computer retailer (not PC World; we have a local back-street guy who is VERY good),
and I kissed goodbye to the I/O issues. This is cool, I thought; all my data became "pre-install" and everything seemed to be fine.
This is where the pint comes in. I "moved" (big mistake) my data from the pre_install folder to the replicated folder, and everything looked good. Get up the next morning: NO DATA. AROOGA AROOGA CURSE CURSE CURSE. (BTW this was Xmas day. I have a record of upsetting the wife, but telling her I may have lost all the photos on Xmas would surely top it all.)
After a week of searching volume shadow copies and offline data on all my PCs and friends' servers, I had successfully recovered ALL our data. (Can I have my testicles back now please?)
So, first things first: MAKE a copy or three, and try again!
This time I did go for a pint or three before "COPYING" in my valuable data, and all is NOW OK.
All the data is almost fully synchronised across the servers. Well, 150 GB of photos and other random data will take a while to sync over WiFi!
Some serious lessons learnt:
1. don't fuck with your home IT on Xmas Eve
2. don't rush NTFRS config
3. don't MOVE data when you have the space to copy it
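Lesson 3 in code: a minimal copy-then-verify sketch (my own helper, nothing NTFRS-specific), so the source only ever gets deleted after the checksums match.

```python
import hashlib
import os
import shutil
import tempfile

def sha256(path):
    """Checksum a file in chunks so large photo archives don't eat RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def copy_and_verify(src, dst):
    """Copy src to dst; return True only if the copy's checksum matches."""
    shutil.copy2(src, dst)
    return sha256(src) == sha256(dst)

# Demo on a throwaway file: remove the original only once verified.
with tempfile.TemporaryDirectory() as d:
    src = os.path.join(d, "photos.dat")
    dst = os.path.join(d, "replica.dat")
    with open(src, "wb") as f:
        f.write(b"irreplaceable family photos")
    if copy_and_verify(src, dst):
        os.remove(src)             # safe: a verified copy exists
    print(os.path.exists(dst))     # True
```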
Going back two years I took a geek test; well, in a moment of boredom I took it again.
It's bad news, I am afraid.
Overall, you scored as follows:
0% scored higher (more computer geeky),
0% scored the same, and
100% scored lower (less geeky).
Compared to those in the same age group as you:
0% scored higher (more computer geeky),
0% scored the same, and
100% scored lower (less geeky).
What does this mean? Your computer geekiness is:
Step aside Bill Gates, Linus Torvalds, and Steve Jobs… You are by far the SUPREME COMPUTER GOD!!!
I am currently looking at Enterprise RSS technology. What a minefield!
No one knows what they want, let alone how to do it.
So I look at it like this.
1. Central location to discover internal and external RSS feeds
2. RSS feed consumption stats
3. Easy to use
4. Easy to configure
NewsGator seems to provide all of the above, but with the small problem of a proprietary client which then has to be deployed to 80,000-100,000 desktops (in my case), and you can guarantee that a large portion of the community (mainly the geeks) won't like the client. So what you really need is an RSS proxy server that will consume, cache and aggregate RSS feeds and republish them as a standard RSS or ATOM feed, which can then be consumed by the user's favourite RSS client (Outlook 2007, IE7, Vista, FeedReader, etc. etc.).
Only found one so far, http://gregarius.net/, which is open-source PHP. I have budget and can buy a solution, but can't find one.
Are my requirements that bizarre?
A corporate version of FeedBurner would be perfect.
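The proxy idea itself is not complicated; here is a minimal sketch in Python (feed contents and names invented), merging several RSS documents and republishing them as one standard RSS 2.0 feed:

```python
import xml.etree.ElementTree as ET

def aggregate_feeds(feed_xmls, title="Corporate aggregate"):
    """Merge the <item>s of several RSS 2.0 documents into one feed."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = title
    for xml_text in feed_xmls:
        for item in ET.fromstring(xml_text).iter("item"):
            channel.append(item)   # a real proxy would cache these
    return ET.tostring(rss, encoding="unicode")

feed_a = ("<rss version='2.0'><channel><title>A</title>"
          "<item><title>Outage notice</title></item></channel></rss>")
feed_b = ("<rss version='2.0'><channel><title>B</title>"
          "<item><title>Patch Tuesday</title></item></channel></rss>")

merged = aggregate_feeds([feed_a, feed_b])
print(merged.count("<item>"))   # 2
```

Because the output is plain RSS 2.0, any client (Outlook 2007, IE7, a phone) can subscribe to the merged feed without a proprietary agent; the missing enterprise pieces are the caching, discovery and consumption stats around it.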
The last day… My brain is close to meltdown, so I only attended a few sessions today, and stuck some labs in the gaps to practice what we had been learning.
Session 1, Delivering rich media with Server 2008; shame the presenter could not deliver a rich presentation.
At least the new media extensions support MP3s and other file formats, and there also seems to be a better level of integration between HTTP and MMS delivery of content. Shame there is NO API for automatic creation of playlists in the current beta, so I may just have to pop to http://iis.net and have a whinge.
Session 2, Q&A with the REAL IIS team. Hardcore stuff, but I will try to summarise what came out of the session, following many questions from myself and a few others:
- IIS is no longer supported with failover clustering; a large debate ensued over why you would cluster instead of load balance.
- The ability to initiate a worker process recycle still requires admin rights and therefore cannot be delegated like other features of IIS7, but Ops Manager may provide a solution there.
- There is no feature to do logging by folder for micro sites, but the integrated pipeline architecture would make it easier to write a plug-in module to achieve that.
- Modules can be written in managed code (.NET) as well as native code (C++), and there is a good SDK.
- Apparently if you install a *.domain SSL cert you can do host-headered SSL… (this was available on IIS6 and something I did not know).
There is a NEW feature for worker process timeouts: at a server level you can enable dynamic timeouts, so if the server is under stress it will adjust the idle timeout of a worker process to free up memory. Something I picked up on, though: this is a server setting, so in a shared environment where heavy apps co-exist with lightly used apps, the session stability of the lightly used apps could suffer. Yet another excuse to go to IIS.NET and have a whinge, sorry, provide constructive criticism.
Just in case anyone thinks a week at TechEd in Barcelona is a jolly, you are sadly mistaken. FIVE days of rapid-fire presentations is total DATA overload, which despite apparently doing nothing is extremely tiring.
So Day 4…..
Session 1. PHP on Windows; now don't laugh. Monday saw the release of FastCGI for Windows Server 2003, a Microsoft bolt-on to run supported CGI apps almost 15 times faster. CGI is traditionally very slow on Windows due to the overhead involved in creating and destroying a NEW process, and CGI creates a NEW process for EVERY request. In FastCGI the process is persisted and re-used, offering almost a 15-fold performance improvement. Add to that that you can use ASP.NET forms authentication in front of Zend PHP and obtain the login credentials, and I may just have to try standing up a copy of WordPress on Windows. Cuz I can!!!
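The CGI vs FastCGI difference is easy to model; a toy process-count sketch (invented numbers, not a real benchmark of either implementation):

```python
# CGI forks a fresh process per request; FastCGI keeps a small pool alive.
class CgiModel:
    def __init__(self):
        self.processes_created = 0
    def handle(self):
        self.processes_created += 1        # spawn, serve, destroy

class FastCgiModel:
    def __init__(self, workers=4):
        self.processes_created = workers   # spawned once, then reused
    def handle(self):
        pass                               # an existing worker serves it

cgi, fcgi = CgiModel(), FastCgiModel(workers=4)
for _ in range(1000):
    cgi.handle()
    fcgi.handle()

print(cgi.processes_created, fcgi.processes_created)   # 1000 4
```

On Windows, where process creation is comparatively expensive, avoiding 996 of those 1000 spawns is where the claimed speed-up comes from.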
Session 2. IIS7 on Server Core. Server Core is the NEW low-footprint install for Windows Server 2008. It is not actually a SKU but an install option: you buy a Standard Edition licence and install Core or Full. (I wonder if they will rebrand it ARCADE edition like the Xbox 360.) Core will present some interesting challenges for the "right click" boys, as there is NO GUI; it is cmd line (hooray) or remote admin. I have never fully understood why a server needs a GUI, but then my first server was Novell NetWare 2. Shame that Server Core does not support managed code, so you can't install the .NET Framework, which sort of cripples a Windows web server.
Session 3. SQL Server indexing by Kimberly Tripp. OH MY GOD, a serious session on SQL indexing with tools and scripts. Using DMVs on SQL 2005, she demonstrated scripts she had created that would recommend index creates and index drops. The most enlightening statement she made (which made so much sense, but I had never thought of it before): don't just look at the long-running queries, but run a trace over a full daily workload and look for common queries. Shaving 50% off a query which is run 5000 times a day provides a better overall performance gain than tuning the long-running query which only runs a few times a day.
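Her point in numbers (figures invented purely for illustration): halving a cheap query that runs thousands of times a day beats even eliminating the occasional monster outright.

```python
def daily_seconds(runs_per_day, seconds_per_run):
    """Total time a query costs the server per day."""
    return runs_per_day * seconds_per_run

common = daily_seconds(5000, 0.2)    # cheap query, run constantly: 1000 s/day
monster = daily_seconds(3, 60)       # expensive query, run rarely:  180 s/day

saved_by_halving_common = common * 0.5   # 500 s/day back
saved_by_killing_monster = monster       # 180 s/day, even at 100% removal
print(saved_by_halving_common > saved_by_killing_monster)   # True
```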