Ignored By Dinosaurs 🦕

I'm sorry, but if there's one thing I love doing, it's taking Drupal down a peg.


I'm currently investigating doing some real time push notification work on my company's sites to make them more buzzword compliant. This is great because it finally gives me a bona fide excuse to dig into a tech I've long been wanting to find a nice small use case for – Node.js. We could easily outsource this piece to something like Pusher, but we outsource a lot of pieces of our architecture, and with each piece comes a little accrual of technical debt. We might be able to skate by without ever having to pay it off, but we just recently went through a large exercise with Exact Target, our email service provider, that was vastly less than smooth. So what I got was buy-in to investigate the merits of keeping it in house.

Now, real time notifications aren't exactly setting up a Wordpress blog, but they're also a pretty well solved problem in this day and age, and the one use case where Node just absolutely earns its bread, so we're looking into WebSockets and tying into some simple events in the Drupal stack. I was just thinking about the presentation I went to at DrupalCampNJ a couple years ago by the author of the Nodejs module. His module was mainly a plumbing job to expose some of Drupal's hook lifecycle to the Node event loop, and it may very well end up being something we leverage, but this phrase popped into my head:

The last thing on Earth I want to do is couple more shit into Drupal. What I want to do is break Drupal into little pieces, but it just keeps getting bigger and bigger. Not unlike how Bear Stearns and Wachovia got absorbed into larger banks that then became even larger banks, Drupal is *too big to fail*. I think we're in the twilight of the monolithic CMS age, but plenty of folks are betting that we aren't. I suspect we'll all have jobs one way or another, but something is just fundamentally unsound about the approach with D8. To me. I am a terrible programmer, by the way, vastly inferior to all core Drupal devs, lower than dirt. Fair disclosure.

#drupal

There's a question currently on r/drupal asking “What's the best way to get your head around Views?”. There are many excellent answers — “study the Views API file”, “get to know the UI”, “fuck Views because it writes poor queries” — but none of them, to me, really answer the question.

Views, for the uninitiated, is the open source, community contributed module that is the reason Drupal is the powerhouse it currently is. Yes, there are many excellent Drupal features that have contributed to its adoption across the CMS marketplace, but there is no other contrib module that increases Drupal's capabilities as vastly as Views does. Views is “a query builder”. That can mean lots of things, since almost any modern CMS is simply a front end to a database somewhere, and the very act of clicking any button on almost any website means that a database is being queried somewhere in the distance. Thus, a “query builder” is a pretty cool tool to have at your disposal. You can contort almost any conceivable feature out of Views if you really know your way around. But how do you learn your way around? To me, that is what the author of the question was getting at.

Well, if you came to Drupal the way I did – trial and bumbling error, and not via a CS program somewhere – then you might be surprised to learn that a “view” is a standard feature in most RDBMSs – Wikipedia has a great entry. In essence, a “view” in SQL is a predefined query. Views allow a DBA (database administrator) to build up a more complex query that they can then hand off to a “normal” user for day to day operations. Maybe this query has several joins and numerous where clauses that are tough to remember but never change, and the business user only needs to supply one varying parameter to get the results they want. Another use case might be limiting access to the DB by granting users access to the views and not to low level querying of the DB (for security reasons). Thus, the seemingly awkwardly named “Views module” actually does exactly what it says it does, if you know the terminology.
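To make that concrete, here's a sketch of a plain SQL view – hypothetical table and column names, nothing Drupal-specific:

```sql
-- The DBA bakes the joins and the fixed conditions in once...
CREATE VIEW active_customer_orders AS
SELECT c.name, o.order_date, o.total
FROM customers c
JOIN orders o ON o.customer_id = c.id
WHERE c.status = 'active';

-- ...and the business user supplies only the varying parameter.
SELECT * FROM active_customer_orders WHERE order_date > '2013-01-01';
```

The user never has to remember the join or the status condition; they just query the view like any other table.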

Thus, the best way to learn Views is to learn SQL itself. Views' strange terminology (contextual filters, relationships, etc.) is just a different set of names for standard query features in MySQL or any other relational database system – contextual filters are parameterized WHERE clauses, relationships are JOINs. Once you start poking around the standard Drupal DB schema and stepping through how a simple Drupal View is put together, you can start to understand the deeper mechanics of how it all works under the hood.

When beginning a new view, the first question you are asked is Show _______ of type _____ sorted by _______. This is the bones of a very simple select query. Show _____ asks which table in the DB is going to be the base of this query; most often it'll stay on the default “content”, which means the node table. Of type _____ becomes where type = whatever, and sorted by _____ does just that. So you end up with something like ...

SELECT * FROM node WHERE type = 'article' ORDER BY created DESC

... and you're off to the races. The rest of the Views wizard allows you to refine this query to (hopefully) pull out what you want to display on the site.

The key to learning Views, therefore, is learning the Drupal database structure in general, and how to query it in straight SQL to get what you want. Once you've wrapped your head around how to join the users table to the role table via the users_roles table in order to pull out every user who is an admin via the mysql command line, it becomes much easier to translate this into a much quicker job in the Views UI. Soon you'll notice that the blocks, users, and comments tables are all plural while node, contact, and role are all singular, and then you'll be well down the path of a deeper understanding of what makes Drupal such an absurdly powerful CMS.
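For example, here's roughly what that admin-users query looks like against a stock Drupal 7 schema (assuming your admin role is literally named 'administrator' – substitute whatever yours is called):

```sql
-- users_roles is the bridge table: one row per (user, role) pair.
SELECT u.uid, u.name
FROM users u
JOIN users_roles ur ON ur.uid = u.uid
JOIN role r ON r.rid = ur.rid
WHERE r.name = 'administrator';
```

Once a join like that feels routine at the mysql prompt, the Views “relationship” UI is just a point-and-click version of the same thing.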

#drupal

After reading the news about FastCGI support landing in HHVM, I had to finally give it a try. I'll assume that you're at least familiar with HHVM and its design goals, but if you aren't, the HHVM wiki is a good place to start. In a nutshell, it's a new PHP runtime courtesy of Facebook that, if you can get it working, promises to run circles around any PHP interpreter currently on the market.

So I came into work on Wednesday fired up to give it a try. I wasn't expecting anything at all, since the sites I work with for my day job are pretty large and have a decent number of contrib modules installed. In case you're wondering, core Drupal is supposedly 100% on the HHVM now, but contrib is a different story.

The first thing you should know is that it only runs on 64-bit versions of Ubuntu, so head over to Digital Ocean and fire one up. I prefer 12.04, so that's what I conducted this experiment on. The first link above gives instructions on how to install HHVM via apt, so that's the route I went. I first tried the tiny little box that this site runs on, which is a 32-bit version of Ubuntu, and while apt would update itself with the new HHVM repo, the package wouldn't install. So, on to plan B, which involved a 1G box running a 64-bit version.

This one installed from the HHVM repo without a hitch — an init script in /etc/init.d/ pointing to some configs in /etc/hhvm. A quick perusal of that config file, which looked very much like an Nginx config, suggested it was pretty straightforward. Installing Nginx and git and everything else I needed to stand a site up was routine. Looking good. So sudo service hhvm start, and we're off to the races. top showed the hhvm process running as www-data, so I hit the root URL. Page not found was the only feedback I got. curl -I gave me an x-something-something header that said HPHP, so I was puzzled. HipHop was listening on port 80 and catching the web traffic directly, as opposed to standing behind Nginx and listening on port 9000.

It took me about 30 minutes of fiddling around, but the real clue was in the instructions for how to start HHVM on a system that doesn't have a repo installer.

cd /path/to/your/www/root
hhvm --mode server -vServer.Type=fastcgi -vServer.Port=9000

So, I took another look at the /etc/hhvm/server.hdf config file that the init script was pointing to and noticed that it was set to listen on port 80, not port 9000, inside of a Server {} block. That Server {} block looked like the perfect place to put Type = fastcgi, so I did, and changed the port to 9000. The docs indicated that the process needs to be started from the root of your PHP app, but that might only apply to non-fastcgi HHVM. I started it from there anyway, and I finally had the thing working.
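For the record, the relevant chunk of my server.hdf ended up looking roughly like this – reconstructed from memory, so treat the exact keys as approximate and check the HHVM wiki:

```
Server {
  Type = fastcgi
  Port = 9000
  SourceRoot = /var/www/mysite
}
```

SourceRoot is whatever your docroot is; everything else in the stock file stayed as it shipped.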

Then came the standard “standing an existing Drupal site up on a new box” fiddling with connection and file system settings, and I actually got the thing to stand up after about 90 minutes of playing with it.

Thoughts

I'm a pretty good sysadmin for a front end developer, but I'm still just a front end developer. It was, however, very easy to find my way around the configs and start to get a sense of what HHVM does. I ended up upsizing the box to a 4G instance for a little bit after it started giving me some memory-related errors that I didn't have the skills to diagnose. I have no idea if that much RAM is needed, but it cost about $.20 to bump it up for a couple hours and find out.

The beautiful thing about it was that once I finally figured out how to stand it up correctly, my existing Nginx/FPM config (a derivative of Perusio's) worked out of the box with absolutely no other intervention. When I got stuck once by a cache clear that I suspect was the real cause of my memory issue, I shut HHVM down, brought FPM up, and got unstuck. After I was stabilized again, I shut FPM back down and brought HHVM back up. It was seamless.
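Conceptually, the swap is trivial because both FPM and HHVM speak FastCGI on the other end of the same Nginx proxy. A sketch, not my literal Perusio-derived config:

```nginx
# Point the PHP upstream at whichever backend is actually running.
upstream phpbackend {
    server 127.0.0.1:9000;                 # HHVM in fastcgi mode
    # server unix:/var/run/php5-fpm.sock;  # swap in PHP-FPM to fall back
}
```

Flip the comment, reload Nginx, and the app never knows the difference.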

I finally gave up after a little bit because although it was working, the HHVM logs were full of notices, and I repeatedly hit a wall with some function, first in the gd lib and then in apachesolr, that didn't like the input it was being fed. I had to get back to “real work”, but I will dive back in very soon and hopefully be able to contribute some feedback. The maintainers are extremely friendly and active on IRC.

I'm very, very excited for the future of this project and would highly recommend giving it a try. I was amazed that I was able to get anything to load on it at all, and am salivating at the thought of actually getting the VM warmed up with a bunch of repeated requests. When it is ready for prime time across a wide variety of PHP apps, it's going to change the way people think about interpreted languages in general, and is going to single-handedly contribute to a faster web. For this, we thank you, Facebook.

#drupal #devops

I've recently finished up a project here at the job that gave me a blank check as far as writing the front end code was concerned. It was among the most blissful Drupal projects I've ever worked on, as my boss did all of the Drupal stuff, and I wrote all the code. It was heaven.

So, there were a lot of requests for some cool javascript features, and rather than reaching for the plugin drawer, I decided to write most of them from scratch. The main feature pages are mostly like this. The left hand “scrollspy” navigation was ripped off from Twitter's Bootstrap UI library, and (imho) warrants its own write-up, as it's some of the coolest code I've written yet.

Most of the moving features on the site, especially the left nav when activated and the contact tab flyout thingy, were written initially using jQuery to animate positioning and display properties on the DOM elements themselves. “60 fps” is the battle cry of the UI engineer this month, though, so I thought I'd try a few new tricks, namely swapping out those jQuery animations for CSS transitions. Turns out this is insanely easy, requires way less code than the previous alternative, and will outperform the JS implementation any day of the week.

var ABM = window.ABM || {};

ABM.contactFlyout = (function() {

var $ = window.jQuery;

var cloneContacts = function() {
  // Clones the contacts div, which already exists on the page,
  // attaches the copy to the page elsewhere where it can be
  // persistent, and slid out with a toggle.
};
 
var attachToggle = function() {
  // Keep $flyout local with var; the original leaked it as a global.
  var $flyout = $('#flyout-wrapper');
  $('#toggle', $flyout).on('click autoFlyout', function() {
    // -670 tucks the 700px-wide div off-canvas; 0 slides it flush.
    var offsetX = ($flyout.hasClass('open')) ? -670 : 0;
    $flyout.animate({
      right: offsetX
    }, 350);
    $flyout.toggleClass('open');
    $(this).toggleClass('open');
  });
};

var flyoutFlash = function() {
  // A thing Marketing wanted where the flyout would pop out if you had
  // never visited this page before, determined by a cookie.
};

// Positions the contact flyout.
var makeSticky = function() {
  // SSIA
};
 
return {
  init: function() {
    if ($('.lt-ie9').length) return;
    cloneContacts();
    attachToggle();
    setInterval(makeSticky, 250);
    var cookieRE = /flyout/;
    if (!cookieRE.test(document.cookie)) {
      flyoutFlash();
      document.cookie = 'flyout=flown';
    }
  }
}
 
})();

The initial implementation.

So, obviously, the meat of the animation is in that attachToggle() method, which is totally poorly named and I'll refactor that right after I finish writing this. But it just animates the positioning of that whole div, totally simple, right? So if the only thing animating is the “right” property of that div, maybe we should try a CSS transition on that instead?

Turns out all that involves is removing that animation bit so it looks like this —

var attachToggle = function() {
  var $flyout = $('#flyout-wrapper');
  $('#toggle', $flyout).on('click autoFlyout', function() {
    $flyout.toggleClass('open');
    $(this).toggleClass('open');
  });
};

And then the stylesheet for that div goes from this —

#flyout-wrapper
  position: fixed
  width: 700px
  right: -670px
  top: 20px
  z-index: 3

To this —

#flyout-wrapper
  position: fixed
  width: 700px
  right: -670px
  top: 20px
  z-index: 3
  transition: right .5s
  &.open
    right: 0

Yeah. Seriously. Instead of animating the whole element, try just adding a class to it – the after state – and let CSS animate the in-between. I did the same thing with the left nav dropdown bit. The whole page is a lot smoother now and I'm going to rework the main nav as soon as feasible.

Edit

If you're animating the position of something, say a #page-wrap element for an off-canvas menu, and it should present in a “normal” state when not activated, you still have to specify the default positioning (i.e. left: 0), or the transition will not work. I just wasted too much time figuring out why my off-canvas nav wasn't working like I wanted it to.
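In Sass terms, with made-up selector names, the gotcha looks like this:

```sass
// Without the explicit left: 0 on the resting state, the browser
// has no starting value to transition from and the nav just snaps.
#page-wrap
  position: relative
  left: 0
  transition: left .5s
  &.nav-open
    left: 260px
```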

#ui #javascript #css

Prologue

I have a lot of friends in successful working bands that all have one thing in common – a fairly useless website. Pretty much every band I know, with very few exceptions, treats their website as a tour poster. Most of them have some kind of “store”, but frequently they're stocked with leftovers from last year's road merch. You often have to go through PayPal to check out. I don't know of anyone short of Radiohead that actually lets you download digital goods directly from them.

Railscasts, Peepcode, Lullabot, and every other player in the digital goods space that caters to developers have had their act together on this for years now, but for some reason the real promise of the internet age hasn't paid a visit to the music business yet.

The promise

Perhaps I misunderstood, but the real promise of the internet age as it pertains to musicians was that the old, centralized means of distributing your music were going to be torn down, and in their place would be a democratic, no-barriers system for getting your music out instantly to the whole world. Services that have sprung up in the last 10 years – iTunes, Spotify, Soundcloud, and the like – have almost universally propped up this old, centralized system. Services like Distrokid are well intentioned and barking up the right tree, but still miss the point.

The problem

The problem is that the music business was so profitable for so many decades, and so many people got rich off the old label system, that the world in general is clearly having a hard time letting go. Every single service that I've mentioned so far has some sort of “gatekeeper” mechanism in place, be it label affiliation or whatever, and profit motive for the business owners as a central tenet.

“How can I (or my investors) make money off the music business?”

What I haven't seen done

What I haven't seen done yet is for someone to come along and offer a service whose fundamental principle is altruism toward the musicians this whole business revolves around in the first place (Distrokid partially excepted).

The solution

How about someone builds an open source CMS for bands and musicians, ala Wordpress since every band I know is already on that and comfy with it, but built with commerce as the fundamental purpose of the site? Plug in a Stripe API key for credit card processing and an S3 key for storing their digital goods, and tada! – you can sell your own music through your own website and keep 100% of the net for yourself.

Obviously the simple features that every band wants – tour dates, photos, bios, etc – would be there, but instead of having the store be a page on the site, have the tour dates be a page in the store. I know they must be out there, but I literally don't know of a single band that approaches their online presence this way.

But, but, how do I make money off this?

I dunno. How about a hosted service, ala Wordpress.com for bands that don't want to deal with setting it up themselves? How about taking a percentage of fees for ticket sales if they want to activate that module? How about consulting fees for custom implementations? I think there are actually plenty of ways, but only if you start with the libre, open version at the core.

Postscript

There must be someone out there working on this idea – I had it years ago and it just seems too obvious at this point. I've been working on it off and on for most of this year. I've even got a guinea pig client lined up, but I'm that stereotypical musician that has kids, gets off the road, and is now so busy with my 9-5 and my 3 kids and trying to find contract work to fill in the financial gaps that I can't keep the momentum going to get it finished and launched. So...

If this idea is interesting to you and you feel like building it, call me. If you're already building it and want someone to help sell it, call me. The band I quit 4 years ago sold 9000 tickets at Red Rocks this summer. I'm now in a band with two guys from Leftover Salmon and a guy from String Cheese Incident. If you've never heard of these bands then you're not a hippy, but trust me – there is serious money to be made in this grassroots, live-music, no-label-affiliation sector of the music business if for no other reason than nobody is looking at this market at all, and the customer base in this market is extremely loyal. And I've got an iPhone stuffed full of potential clients.

All this product would have to do is show some real revenue increases for a couple of bands. Once that happens, the real promise of the internet as it pertains to the music business can start to be realized.

Postpostscript

I'm a developer now. I'm handy enough with the backend, and what I consider pretty good with the front end of things, especially as far as the technical needs of this project are concerned. Point is, I can contribute a lot – business, development, a laundry list of clients, implementation details (if you want em) – I just can't do it all myself. I've been holding on to this idea for so long, trying to build up the dev chops to execute that it's starting to eat me, especially since I now have the dev chops and no time.

So if you're reading this and it strikes a chord, drop me a line – therealjohnnygrubb@gmail.com

#music #business

Hi there, probably-front-end-dev-who's-met-and-used-Sass-and-likes-what-they-see. This is for you.

RubyGems

Sass is made out of Ruby – a very pleasant, general purpose programming language that's pretty easy to learn and like. Ruby has a package management system whereby libraries of Ruby code are bundled up into what are known as “gems”. Sass is a gem. When you install it, you get a couple of new executables to play with in the terminal, namely sass and sass-convert. The latter will help get you started with Sass by converting your straight CSS to Sass. RubyGems inspired PHP's new-but-already-dominant package manager, Composer.

rbenv

If you are a Mac user using the version of Ruby that came with your Mac, you are on a version of Ruby that's actually beyond End Of Life. If all you're ever interested in is Sass, it'll keep working for a while longer, but eventually you'll be left behind. A relic. This is the bad news. The good news is that the Ruby community has been working on this problem for a while.

[!info] Because Ruby 1.9 came out a while back and has a bunch of cool new stuff in the form of performance enhancements, syntactic polish, and overall love via its contributors, and because 1.8 is in life's endzone, and because using outdated versions of open source software just isn't your preferred thing, you'll want to use 1.9. This is how.

The most commonly blogged about solution to this in the Ruby world is RVM. We're not going to talk about that. We're going to talk about a solution called rbenv. Rbenv is a more recent and lightweight solution to this multiple Ruby versions problem that doesn't require sudo to install and update Gems, and allows you to install almost any version of Ruby you desire (of which there are plenty, but that's more than you need to know right now).

Rbenv works on any *nix based system, and installation is super simple:

$ git clone https://github.com/sstephenson/rbenv.git ~/.rbenv

This installs rbenv, the version manager. Add rbenv to your $PATH -

$ echo 'export PATH="$HOME/.rbenv/bin:$PATH"' >> ~/.bash_profile

$ echo 'eval "$(rbenv init -)"' >> ~/.bash_profile

(Zsh users put those two lines in ~/.zshrc).

You might as well go all the way and install the Ruby version installer, a separate tool – ruby-build.

git clone https://github.com/sstephenson/ruby-build.git ~/.rbenv/plugins/ruby-build

At this point, you'll reload your shell – exec $SHELL – and you're ready to rumble. Ruby 2.0.0 was released earlier this year, so unless you really like living on the edge, 1.9.3 is a safe bet.

rbenv install 1.9.3-p448 – (the most recent release as of this time, refer to changelog).

I almost forgot to mention rbenv rehash – probably the rbenv command you'll use the most. “rehash” basically tells rbenv to reload itself after you gem install any new gem that comes with an executable (like Sass). If you install a new gem and for some reason your computer acts like it has no idea, it's almost certainly this.

Incidentally, both of these tools were written by the same guy – Sam Stephenson. He works at 37signals, the home of Basecamp, the original Ruby on Rails app, created by a mystical figure known simply by his initials.

Matz/DHH

Super quick Ruby history lesson...


Ruby recently celebrated its 20th birthday which, without any consultation of Wikipedia, makes it roughly the same age as PHP. Ruby's creator and spiritual leader is a guy named Yukihiro Matsumoto, or Matz for short. PHP obviously grabbed its share of the market more quickly, and Ruby scarcely got off the island of Japan for about the first half of those 20 years, until it was catapulted onto the world stage by one man – DHH.

DHH is a charismatic developer from northern Europe with a fondness for business and hair gel. DHH cast off his PHP chains when he found Ruby, created an honest to god framework out of it, open sourced it, and then ran with it. Much of Rails' rise to prominence coincided with the rise of Github and the two together are probably largely responsible for touching off Git's adoption in the greater marketplace.

Rails has impacted the design of almost every web framework that has come since, in any language, each either directly borrowing its ideas or reacting to its opinions. Sass came in its wake, and here we are.

The Well Grounded Rubyist

If I can recommend one Ruby book to get, it's The Well Grounded Rubyist by David Black.

Black is one of the few western developers who has been doing Ruby since before Rails came along, and is a preeminent authority on the language. This book was my introduction to Ruby's version of OOP, which is indescribably more elegant, consistent, and to-the-point than PHP's, and reads almost like a great novel in the way that it builds in intensity from beginning to end and rewards repeated readings. No, I'm not shitting you.

#drupal #ruby #devops

Whenever a new developer shows up in some online thread asking for advice on how to learn to code, the replies always include “find an open source project to help with”. The 5th birthday of the Macintosh that I bought to learn to code is any day now, and I've just now worked up the chops and the courage to follow that advice. Here's what I'd say to a younger me.

When people say that, it's usually really intimidating to think about. What project? How do I get involved? What if I suck and get laughed off the internet? Well..

Pick a big one. Pick Drupal. Drupal is a huge, beautiful mess of an open source project and Drupal developers are highly in demand right now. This means that there is lots to work on, and when you've got something to show you can get paid decently well to do it. The advice is always to “scratch your own itch”, and indeed that's what pretty much every developer in open source is doing. I just had my first patch applied to a project. It took me about 3 weeks from start to finish, but the majority of that wasn't actually writing code. It was learning about what the other code did that I was trying to patch so that I could write a feature that actually followed Drupal conventions. This was a very simple little feature, but what I learned between 3 weeks ago and now ties into a LOT of core Drupal principles that have totally enabled me to write this other module that I need for work and have it work as intended on the first try.

So, in summary – help out on an open source project. It'll make you a better developer faster than anything else.

#drupal #opensource

What's the difference between assigning an anonymous javascript function to a variable and just declaring a named function in the first place? Turns out “hoisting” of the function only works if you declare it as a named function. When you assign an anonymous function to a var, the variable declaration is hoisted to the top of the scope but the assignment isn't, so the variable is still undefined when earlier code tries to use it.

(function() {
	console.log(f()); // 'hello'
	function f() {
		return 'hello';
	}
})();

(function() {
	console.log(f); // undefined – the var is hoisted, the assignment isn't
	// console.log(f()); // would throw TypeError: f is not a function
	var f = function() {
		return 'hello';
	};
})();

#javascript


The program "postgres" was found by "/usr/local/Cellar/postgresql/9.2.4/bin/initdb"
but was not the same version as initdb.

I've been battling this for the last couple of hours, trying to figure out why I can't make Postgres run as easily on my desktop as I did on my laptop. Homebrew took care of it all, just leaving me with the agony of taking off the MySQL training wheels to figure out this new and scary Postgres admin syntax.

So I uninstalled the Homebrew version, went to the EnterpriseDB site, and downloaded the official installer for Mac. This didn't yield any results either, and it seemed to want you to use the GUI tools to administer it anyway. which psql kept giving me /usr/bin/psql, which should've been more of a clue, but I'm not that quick. psql --version kept giving me 9.0.2, which also should've been more of a clue, but I just figured I must've installed Postgres a long time ago, given up, and forgotten about it.

Then I remembered: Mac OS X Server comes with Postgres. That's why it was reporting /usr/bin for all its paths instead of /usr/local/bin (the Homebrew default) or /Library/Postgres (the official installer default).

There was also a set of _postgres processes showing up in ps that I couldn't figure out how to kill. So, the flamethrower method is to delete everything in /usr/bin that relates to PG – psql, postgres_real, anything you can find. Don't forget /usr/bin/initdb, because that's what was throwing the above error. Then you can get on with the Homebrew install.

#postgres #devops

I'm the honcho in charge of our in-house bug tracker – Redmine. Redmine is a rather large Ruby on Rails project, thus nobody in house when I started here had any knowledge of, or interest in, maintaining the thing, since Ruby servers have a bad rap for being kind of finicky to set up, at least in relation to PHP. So it goes.

I recently upgraded to the latest stable release – 2.3.1 – and decided to 86 Passenger as our app server in favor of Unicorn. I've been setting up all my Ruby servers with Unicorn lately and find it to be easier than Passenger, even tho ease of deployment is Passenger's whole selling point. I find Nginx reverse proxying back to a pool of app servers, ala the PHP-FPM setup that's running this site currently, to be an easy mental model to get my head around. I get Passenger's mod_rails/mod_php approach, I just prefer the other.

Anyway, sorry.

So I upgraded the whole infrastructure last week to Ruby 1.9.3-p392 and Redmine 2.3.1, and everything went fairly smoothly. I was alerted to a bug this morning, though, where attachments were being mangled. Basically, everything was being truncated to the first 48k of the file, and this applied to images as well as PDFs and Excel spreadsheets. I dove into the files/ directory in Redmine and saw that the files were all there and that the file sizes were correct, so they were getting to the server, just not coming back.

I suspected I'd broken something with the new app server, but it took me a while to track it down. I'd previously been running the Nginx process and the Passenger process as the same user on the server, something I changed in this recent deployment. After trotting through all the Unicorn logs, the Rails logs, and what I thought were the Nginx logs, I found some “other” Nginx logs that happened to be the ones that were actually being used now. They were filled with this —

2013/05/22 14:13:58 [crit] 17604#0: \*9408 open() "/opt/nginx/proxy_temp/7/06/0000000067" failed (13: Permission denied) while reading upstream, client: 65.126.154.6, server: _, request: "GET /attachments/download/2323/AdvertiserEmailLeads_with_Verbiage_Changes.xls HTTP/1.1", upstream: "http://unix:/tmp/redmine.sock:/attachments/download/2323/AdvertiserEmailLeads_with_Verbiage_Changes.xls", host: "redmine.advantagemedia.com", referrer: "http://redmine.advantagemedia.com/issues/4958"

2013/05/22 14:14:43 [crit] 21936#0: \*8 open() "/opt/nginx/proxy_temp/1/00/0000000001" failed (13: Permission denied) while reading upstream, client: 65.126.154.6, server: _, request: "GET /attachments/download/1454/Balluff_112012html5.zip HTTP/1.1", upstream: "http://unix:/tmp/redmine.sock:/attachments/download/1454/Balluff_112012html5.zip", host: "redmine.advantagemedia.com", referrer: "http://redmine.advantagemedia.com/issues/4229"

So that's a good thing, because we're getting really warm by this point. Basically, the /opt/nginx/proxy_temp directory was full of proxy temp files still owned by the old Nginx user. Now that the Nginx process was running as user nobody, the ownership and permissions were wrong. So a chown -R nobody /opt/nginx/proxy_temp and everything was right with the world.

#devops