Hacking Websites with Ruby and Nokogiri
Every once in a while we get a chance to use software for the good of others. Last week I had just that opportunity. The story begins with a website called Cinematical. Cinematical is a website dedicated to movies, and I have been an avid reader for a number of years. Flash forward to 2011: Cinematical is now owned by AOL, which just recently purchased the Huffington Post. HuffPo has made the decision to move away from using freelance writers and editors to using permanent employees. This decision was codified in an e-mail last week where the freelancers were essentially told "We no longer need your skills as a freelancer, but you can write for us for….. free". Nice touch, HuffPo. If you are interested, this article does a good job of explaining the situation: http://www.hollywoodnews.com/2011/04/07/boycott-aol-and-huffington-post-behind-the-untimely-death-of-cinematical/
UPDATE: A new first-person account of the Cinematical story:
http://www.ericdsnider.com/snide/leaving-in-a-huff/
So now we turn to last week. A couple of years ago I moved to Austin, and it turns out that a lot of the freelancers who wrote for Cinematical live here; through happenstance I have become friends with a number of them. You can't help it when you go to 100+ movies a year and see the same faces in the lobby. Eventually you have to say HI 🙂
Last week I saw a post from a friend of mine (we'll call him Author-X) who is one of these freelancers. He was asking if anyone knew how to "scrape" all of his articles off of Cinematical's web site. You see, Author-X (and, it turns out, a lot of the other writers) didn't save copies of their work and wanted copies.
Well, I am a hacker, and I really didn't agree with the decisions of HuffPo, so I decided to take on this challenge and see if I could help Author-X. I cracked open my handy-dandy text editor and started working.
It turns out this was pretty simple thanks to a killer Ruby gem called Nokogiri (http://nokogiri.org/). Nokogiri is a library that makes it very simple to parse HTML pages using CSS-style syntax. Here's the thought process behind this:

- Query all articles that Author-X had written.
- Go to each article and scrape the contents.
- Save the contents to a text file.

So here's how it worked:
Query All Author’s Articles
Lucky for us, Cinematical organized its articles by author. For instance, Author-X's articles can be found using this link: http://blog.moviefone.com/bloggers/author-x/ So this was to be our starting point. I opened his page and paged through his articles. The site used a pattern of appending "/page/2/", "/page/3/", and so on to the end of the URL. Now I knew how to retrieve all of his articles: I simply needed to create a URL containing the page number and use that as my source of data. Here is the basic loop for doing this:
require 'rubygems'
require 'open-uri'
require 'nokogiri'

author = 'author-x'
1.upto(300) do |page|
  urltext = 'http://blog.moviefone.com/bloggers/' + author
  urltext << '/page/' + page.to_s + '/'
  doc = Nokogiri::HTML(open(urltext))
end
The next step was to see if we could pull the links to each article from each page of articles. Well this is where the real power of Nokogiri comes in. The beautiful thing about Nokogiri is that it allows us to parse pages using CSS syntax. Being an avid jQuery coder I was right at home. The code to pull out all links from this page is as follows:
doc = Nokogiri::HTML(open(urltext))
doc.css('h2 > a').each do |a_tag|
  puts "========================================"
  puts a_tag.content
  puts a_tag['href']
end
As you can see, I simply look for all H2 tags with an anchor child element. Another nice thing about the Cinematical site was its consistent layout; a consistent layout makes a site much simpler to parse. Lucky for us, pulling the links to each article was as simple as the code shown above.
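If you want to see the selector at work without hitting the network, here is a tiny self-contained sketch; the HTML fragment is made up for illustration, but it is shaped like Cinematical's article listing pages:

require 'rubygems'
require 'nokogiri'

# A made-up fragment shaped like the article listing pages:
html = <<-HTML
  <h2><a href="/2011/04/01/some-review/">Some Review</a></h2>
  <h2><a href="/2011/04/02/another-review/">Another Review</a></h2>
HTML

doc = Nokogiri::HTML(html)
doc.css('h2 > a').each do |a_tag|
  puts a_tag.content   # the article title
  puts a_tag['href']   # the link to the article body
end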
The last step was to follow each URL and retrieve the article content.
Go To Each Article and Scrape Its Content
Now that we had the links to each article, we simply needed to load each article's page and extract its contents. We did this with another call to Nokogiri.
doc2 = Nokogiri::HTML(open(a_tag['href']))
doc2.css('.post-body').each do |article|
  puts article.content
end
Once again the consistency of the site helped: we simply needed to query for the CSS class .post-body and we were home free.
The Whole Enchilada
So here's the final code. We have a few extra items in there, like saving to a file, exception handling (some of the URLs were malformed), and an array of authors we wanted content for. Here ya go:
require 'rubygems'
require 'open-uri'
require 'nokogiri'

authors = ['author-X','author2','author3','author4','author5']

authors.each do |author|
  f = File.new(author + ".txt", 'w')
  1.upto(5) do |page|
    urltext = 'http://blog.moviefone.com/bloggers/' + author
    urltext << '/page/' + page.to_s + '/'
    puts urltext
    f.puts urltext
    doc = Nokogiri::HTML(open(urltext))
    doc.css('h2 > a').each do |a_tag|
      begin
        f.puts "==============================="
        f.puts a_tag.content
        f.puts a_tag['href']
        doc2 = Nokogiri::HTML(open(a_tag['href']))
        doc2.css('.post-body').each do |article|
          f.puts article.content
        end
        f.puts "==============================="
      rescue
        puts "ERROR DOWNLOADING: " + a_tag['href']
        f.puts "ERROR DOWNLOADING: " + a_tag['href']
      end
    end
  end
end
As you can see this is pretty simple. I was amazed at what I was able to pull off in 38 lines of code.
Dear .NET Community, You Blew It!
Dear .NET Community,
You Blew It!
A couple of weeks ago a critical tool in the .NET ecosystem went from free to commercial. That tool is Reflector, and the owner of said product is Red Gate Software. A lot of members of the .NET community complained that Red Gate was going back on its original deal with the .NET community by taking Reflector from free to commercial. In all reality, Red Gate paid for the rights to Reflector fair and square and is within its rights to do exactly what it did.
"The chickens have come home to roost."

Translated: it's your own damn fault.
What happened is you found yourself warmed under the blanket of a commercial software vendor who decided it was time to kick you out of bed. What was once free is no longer and you are now paying the price (literally) for your lack of attention. Time and time again the .NET community abdicates control of its own destiny to commercial vendors. Almost every year Microsoft releases products that compete with open source applications that are already available and in most cases better than the commercial products that Microsoft releases.
Here are a few examples:

- First there was Subversion and Git; now there is TFS.
- First there was NHibernate, SubSonic, CSLA, and db4o; now there is Entity Framework.
- First there was NUnit and xUnit; now there is MSTest.
- First there was StructureMap, Windsor, and Ninject; now there is Unity.
- First there was Hudson and CruiseControl; now there is TFS.
- First there was …; now there is <some half-baked, tightly integrated product>.
Why we as a community do this to ourselves baffles me. Let's take a look at what happens to a commercial vendor when it gets out of line.

Last year Oracle bought Sun Microsystems. It did not take long before Oracle overplayed its hand with the many open source products in Sun's portfolio. What has happened since is that a number of critical OSS projects have been forked and new communities have been born.
Many of you might know about the product OpenOffice. OpenOffice is an OSS project that competes with the Microsoft Office suite. A number of members of the OpenOffice community created a fork of OpenOffice, and LibreOffice was born. http://en.wikipedia.org/wiki/LibreOffice
The second OSS project that has taken its ball elsewhere is the Hudson project. Hudson is a continuous integration server. What was once Hudson is now Jenkins CI. Basically, Oracle overplayed its hand in this community as well, and the members of that OSS community went elsewhere. The new project lives at http://jenkins-ci.org/
From these two examples you can see the real power of OSS. If the sponsors of a project get out of line, the community can take its code and talent elsewhere. Is this even remotely possible in the Microsoft ecosystem? Let's take a look.
When Microsoft decides to kill tools that you have adopted, what can you do? Pretty much nothing. Do you need a case in point? OK, here's one for you: LINQ to SQL. In October 2008 Microsoft decided that LINQ to SQL would be deprecated. Here's the post:
http://blogs.msdn.com/b/adonet/archive/2008/10/29/update-on-linq-to-sql-and-linq-to-entities-roadmap.aspx
So what is a developer to do? Pretty much you have a single choice: abandon what you are doing and move on to the next "Promised Land" of data frameworks. Yes, you can maintain your current code base, but do you think it's wise to continue development on technology you know is dead?
There is a real second choice, but it takes a certain amount of bravery: abandon Microsoft when it comes to critical choices like data access and find an OSS project that meets your needs.
Instead of using ASP.NET MVC, how about taking a serious look at FubuMVC (http://fubumvc.com/)? FubuMVC is an MVC framework built by a team of developers who really use it. This makes more difference than you might imagine. How can a company really understand the pain of its own frameworks if it doesn't use them? As an analogue, Ruby on Rails (http://www.rubyonrails.org) is highly functional as a web framework because it was, and is, built by developers who actually use it.
In this case I am picking on the ASP.NET MVC team a little, but I do commend that team for its behavior. The ASP.NET MVC team is arguably the most transparent team at Microsoft. The ASP.NET MVC framework is distributed under an open source license, and you are free to download and modify the code as you see fit. Should the community decide to, it could fork the code and follow its own path. In the case of ASP.NET MVC it is nice to have choices.

It is up to us as a community to quit abdicating our responsibility to the mother ship. We need to do the right thing, take our destinies into our own hands, and start supporting existing open source projects. Heck, if it's an itch you need to scratch, you might just want to start your own.
ADIAD
Sorry for the repost… For some reason Community Server borked the original post for readers with IE.
Sometimes you are lucky enough to work for a company with a good value system. Sometimes you are lucky enough to work for a company that actually sticks to its value system.
My first real job was as a network administrator/database developer for Eagle Crest resort in Redmond, OR. The CEO of the resort drove a car with a vanity license plate consisting of the letters ADIAD. One morning I asked Jerry just what that license plate meant. He said: "Rod, that stands for A Deal Is A Deal." During my time at that company, that value was honored time after time. From that experience, that little acronym has stuck with me for the last 20+ years, and I have tried to run my life and business according to those values.
Sometimes those values work in my favor. Sometimes they don't. Many years ago I worked on some courseware for a company. I offered up my price for the courseware and the company accepted it. Later on I found out how much another instructor was paid for equivalent courseware. I was a little miffed, but I didn't complain. I made a better deal the next time. You see, I got what I asked for. ADIAD.
A few years later fellow Los Techies blogger and my best friend John Petersen had a similar situation. He took a small consulting job with another developer we know. During the consulting gig he found out how much our developer friend was paid for the job. It was significantly more than the compensation he had asked for, and received. John and I discussed the "fairness" of this situation. During this discussion I asked him a question: "How much did you ask for? And were you paid that amount?" The answer was yes. I responded: ADIAD; next time you will make a better deal. While the ethics or straightforwardness of the other developer may be called into question, ADIAD.
I am telling you this story because over the last couple of weeks something interesting has transpired. Two companies have made decisions to turn what were once free products into paid products. The first company I will mention is Red Gate Software. In 2008 Red Gate took over the maintenance of a product called Reflector. Reflector is a tool used by .NET developers to decompile and peer into the source code of .NET assemblies. Until 2008 Reflector was maintained and provided to the .NET community for free, and until last week Reflector remained a free product. Not anymore. In a few months Red Gate will begin charging for the product. Now, this would be no big deal if not for what Red Gate said when they took over the product. Lutz Roeder and James Moore (of Red Gate) were interviewed by Bob Cramblitt of simple-talk.com (http://www.simple-talk.com/opinion/opinion-pieces/the-future-of-reflector-/). In this interview James Moore of Red Gate said they would keep Reflector free for the community: "I think we can provide a level of resources that will move the tool forward in a big way. The first thing we are doing is continuing to offer the software to the community for free downloading."
This has proven to be a real issue for the community and Red Gate. Red Gate is an active supporter of the .NET community (in the interest of full disclosure: Red Gate provides free licenses to Microsoft MVPs like me), and this is not a good way of dealing with that community. It's hard to put the horse back in the barn, but this could definitely have been handled better. Red Gate should have subscribed to the ADIAD principle. They might have bitten off more than they could handle in supporting this as a free product, but they should have stood by their word and kept the product free. If they didn't want to continue investing in the product, they could have open sourced it. The PR from that alone would have been very valuable to them. Now, I have no idea how much they paid for Reflector, but sometimes you make a deal that you just have to live with.
The second case of a product going from free to paid is Pivotal Tracker. Pivotal Tracker is an online project management system. A few weeks ago Pivotal announced that they were moving to a paid pricing model. The reaction from that community could not have been more different: the community responded positively to the change. There were also criticisms of the decision, and the company responded to them. Two things stand out. First, Pivotal gave the community notice, offering another six months of free usage; that is enough time for companies to seek alternatives. Second, Pivotal Tracker would remain free for individuals. These two items are probably what helped that community receive the change more favorably. I am not sure what deal, if any, Pivotal made with its community, but the way they handled this was much better, IMO. For the history of this announcement and the subsequent changes, here are links to the original post and the follow-up:
http://pivotallabs.com/users/dan/blog/articles/1537-introducing-pivotal-tracker-pricing
http://pivotallabs.com/users/dan/blog/articles/1547-changes-to-pivotal-tracker-pricing
I am not sure how either of these cases will shake out, but each case can teach us all an important lesson. In life you are only as good as your word. A Deal Is A Deal.
Commented Code == Technical Debt
Over the last month I have spent countless hours modifying code in our development framework. This framework is nearing the 4th grade now (9 years old) and has had many sections modified or rewritten extensively over the years.
A common technique I used when changing code was to comment out large swaths of code and insert new code in their place. I can tell you now: this technique was a poor idea and has wasted a considerable amount of my time. This may sound obvious to you, but to some it might not. Here's what I experienced:
1) Every time I found a commented block I was forced to perform a context switch. Why did I comment this code? These context switches are the equivalent of a phone call or a child coming into the home office to play. They take precious time away from the task at hand.
2) The commented-out code takes up space on the screen and in the flow of the code. This requires a mental filter and a minor context switch.
3) And finally, it is like doing testing after the fact. If you find yourself adding comments like //REMOVE THIS CODE LATER, you have lost. Just kill the code now. You won't remember why you added that line in the first place, so just don't do it. kill -9 that cruft now.
So what if you feel like your code is precious and needs to be kept intact like it's the Magna Carta? Just use your version control system. The code will always be there to recover, like the Ark of the Covenant. If you don't have a version control system, get one. There is NO reason not to.
So now that I have your attention… kill that cruft. Otherwise I’ll submit your name to my new reality show: Code Hoarders
NOTE: For those of you who are code-comment haters, this post is not for you. Relevant comments are always a good idea and should be used. Consider this flame bait if you so desire 🙂
JavaScript Can't Do Math – SilverlightCalculator, a New Silverlight OSS Project
It's amazing what a lack of sleep can do for the OSS world. Over the last year I have encountered numerous places where I wanted to do simple financial calculations in my JavaScript applications. You know, really complex stuff like adding and subtracting dollar amounts. Well, if you have spent any time doing JavaScript work, you soon realize that getting exact answers is just a dream. Hence the creation of the Silverlight Calculator. This is a really simple application and is described by the contents of the README below.
The GitHub Repo for this project is: https://github.com/rodpaddock/SilverlightCalculator
You can also try this application at http://calculator.dashpoint.com/
This is my first OSS project and I would appreciate any feedback or changes you may offer.
README FILE
Welcome to the Silverlight Calculator.
This application was created for one simple reason: JAVASCRIPT CANNOT DO MATH.

What this really means is that the only numeric type in JavaScript is floating point, and floating-point math is very difficult, if not impossible, to use for basic financial calculations: adding, subtracting, multiplying, and dividing dollar amounts. (A short illustration of the problem follows the list below.)

This application solves this by creating a basic calculator that provides four basic operations:
Add(number1, number2)
Subtract(number1, number2)
Multiply(number1, number2)
Divide(number1, number2)
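To make the floating-point problem concrete, here is a quick illustration. It is written in Ruby rather than JavaScript, but Ruby's Float is the same IEEE 754 double that a JavaScript number is, so it shows the same drift; BigDecimal stands in here for the kind of exact decimal math the calculator provides:

require 'bigdecimal'

# Binary floating point cannot represent most decimal fractions exactly:
puts 0.1 + 0.2 == 0.3             # => false
printf("%.17f\n", 0.1 + 0.2)      # => 0.30000000000000004

# A decimal type avoids the drift; BigDecimal works in base 10:
sum = BigDecimal("0.10") + BigDecimal("0.20")
puts sum == BigDecimal("0.30")    # => true
puts sum.to_s("F")                # => 0.3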
This application is written in C# and deployed via a Silverlight Plugin.
There is a fully functional example in the WebSample Folder.
You can also access a demo of this application at http://calculator.dashpoint.com
Using this application is very simple. You just add an OBJECT tag in your web page pointing at the SilverlightCalculator.xap file. Then, using jQuery (or whatever JavaScript library you use), grab a handle to the plug-in and call the operation you need for your math calculations.
<object id="calculator" data="data:application/x-silverlight-2," type="application/x-silverlight-2" width="0%" height="0%">
  <param name="source" value="Silverlight/SilverlightCalculator.xap"/>
  <param name="background" value="white" />
  <param name="minRuntimeVersion" value="3.0.40818.0" />
  <param name="autoUpgrade" value="true" />
  <a href="http://go.microsoft.com/fwlink/?LinkID=149156&v=3.0.40818.0" style="text-decoration:none">
    <img src="http://go.microsoft.com/fwlink/?LinkId=108181" alt="Get Microsoft Silverlight" style="border-style:none"/>
  </a>
</object>
function getCalculatorHandle() {
  // grab a handle to the Silverlight calculator plug-in
  var returnValue = document.getElementById("calculator").content.calculator;
  return returnValue;
}
// sample call to the Add function
function addNumbers() {
  var factor_1 = $("#addNumber1").val();
  var factor_2 = $("#addNumber2").val();
  var calc = getCalculatorHandle().Add(factor_1, factor_2);
  alert(calc);
}
This Week in Fail
Last week was a week full of fail. But not the type of fail you might think.
Last week at VSLive Redmond, Microsoft demonstrated a new application framework called LightSwitch. From what I read and saw, LightSwitch is a tool for building small line of business applications for Silverlight and WPF. These applications are constructed using a “point and shoot” interface that allows applications to be built and deployed rapidly.
And this is where the fail happened, but not the type of fail you might think.
The community failed in a big way. How did it fail? The community FAILED by applying the immediate knee-jerk anti-Microsoft sentiment that permeates the air today.
In the post LightSwitch: The Return Of The Secretary, Ayende Rahien immediately criticized LightSwitch without ever laying his hands on the tool (though he did disclose as much in his post). He went as far as making suppositions about things that he claims will probably not work well. How can he make these assertions without real examination of the tool?
Later I found another post, from Donald Belcham: Microsoft.Data.dll and LightSwitch. In this post Donald talks about small line-of-business applications that eventually need to "grow up" and become "real" software projects. These applications are, and were, built in tools like Microsoft Access and Excel, and they serve a critical need for business users in companies large and small. Yes, these applications get built, and they might eventually become part of the lethargic swamp that defines the standard of most IT shops today. He concludes his post with the statement:
“To the professional developers that read this blog (most of you I’m guessing), prepare to move your hatred and loathing from MS Access to LightSwitch.”
His sentiment might come to be a reality, but how can such a bold assertion be made without ever using the tool firsthand?
And this is where the community failed. Some leaders of the community failed because they spouted criticisms of a tool they have no firsthand experience with. The community at large fails because we continue to put up with un-intellectual knee-jerk reactions from our leaders.
As a community we need to have higher standards. We need to do our homework, criticize and comment from a position of intellectual pursuit and research, and finally we need to leave our bias at the door.
"There is no substitute for face-to-face reporting and research." – Thomas Friedman
Using Fluent NHibernate With Legacy Databases
I am currently working on a project that is required to access data from a legacy SQL Server database.
One feature of this system is the ability to add and store checking accounts. These checking accounts are used to make payments on customer accounts. When making a payment using a checking account, the vendor needs two pieces of information: an ABA (American Bankers Association) routing number and a checking account number. An ABA number is found at the bottom of your checks, next to your bank account number, and can be validated against a list of valid ABA numbers. This is where I encountered the need to validate against a legacy database.
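As an aside, ABA numbers also carry a built-in sanity check: the nine digits must satisfy a weighted checksum (weights 3, 7, 1 repeating) whose total divides evenly by 10. Here is a minimal Ruby sketch of that rule; the method name valid_aba? is mine, and this complements rather than replaces the lookup-table validation described below:

# Standard ABA routing-number checksum: weight the nine digits
# 3, 7, 1, 3, 7, 1, 3, 7, 1; the total must be divisible by 10.
def valid_aba?(routing_number)
  digits = routing_number.to_s.scan(/\d/).map { |d| d.to_i }
  return false unless digits.length == 9
  weights = [3, 7, 1, 3, 7, 1, 3, 7, 1]
  sum = digits.zip(weights).inject(0) { |total, (d, w)| total + d * w }
  sum % 10 == 0
end

puts valid_aba?("021000021")  # => true  (a widely published example number)
puts valid_aba?("123456789")  # => false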
In our legacy system, ABA routing numbers are stored in a database called viplookups, in a table called bnkroute.

For this feature I created a domain object called BankInfo:
public class BankInfo : BaseEntity
{
public virtual string AbaNumber { get; set; }
public virtual string Name { get; set; }
public virtual string City { get; set; }
public virtual string State { get; set; }
public virtual string PhoneNumber { get; set; }
}
Now I needed to map this entity to our legacy database. Our mapping for this feature is as follows:

public class BankInfoMap : ClassMap<BankInfo>
{
    public BankInfoMap()
    {
        Schema("viplookups.dbo");
        Table("bnkroute");
        SchemaAction.None();
        // property-to-column mappings elided
    }
}
So let's talk about the relevant Fluent NHibernate features:
The first item of interest is the Schema() method. The Schema() method tells NHibernate to pull this entity from a specified database. In my case this database exists on the same SQL Server instance, so I didn't need to try it against another server. If you have knowledge of this working on another server, leave a comment here.
The next item of interest is the Table() method. This is pretty straightforward Fluent NHibernate and specifies the legacy table to pull your data from.
The next interesting feature is SchemaAction.None(). When developing our applications, I have an integration test that is used to build all our default schema. I DON'T want these tables to be generated in our schema; they are external. SchemaAction.None() tells NHibernate not to create this entity in the database.
So that’s it. A simple combination of Fluent NHibernate features to access data from a legacy database properly.
Film Making: A Better Software Development Metaphor
The craft of software development is in dire need of a new metaphor. In this author's opinion, the common "manufacturing" metaphor we bandy about is tired and has bigger holes in it than a Round Rock Donut.

The issue that bothers me most: manufacturing implies processes that deliver real and tangible products. Manufacturing conjures up images of sprockets, cogs, gears, widgets, and cars.

This is where I think the metaphor is simply tired. At the end of the day we make something entirely different: items that are neither hardened nor tangible, but "bendable".

There is a better metaphor for what we do: film making. The items that we deliver from our process are more akin to movies than cars rolling off a production line at Toyota or widgets rolling off a line at Spacely Sprockets.
There are so many parallels in the movie industry it is astonishing.
Here are a few movie industry terms and how they parallel software development:

- Requirements/Design
  - Treatments – RFQ/RFP
  - Scripts – Detailed Requirements
  - Green Light/Red Light
  - Budget – Budget
  - Schedule – Schedule
  - Scope – Scope
- Team Parallels
  - Director – Principal Software Developer
  - Writer – Analyst
  - Special Effects Supervisor – Research and Development
  - Camera Operator – Software Developer
  - Producer – Project Management
- Project Size Parallels
  - Trailers – Demos/Prototypes/Spikes
  - Shorts – Stories
  - Television Shows – Deliverables
  - Features – Projects
  - Trilogies/Epics – Large Projects
- Requirements/Design
  - Scripts – Detailed Requirements
  - Scenes – Stories
  - Shot Lists – Project backlog
- Pre-Production – Project initiation/planning
  - Storyboards – Mock-ups
  - Animatics – Prototypes
- Principal Photography – Software Development Process
  - Setups – Iteration development
  - Cutting Room Floor – Cutting features
- Post-Production – Project Delivery
  - Reshoots – Rework of current features
  - Cutting Room Floor – Cutting features
I hope you can see from the aforementioned list that the process of software development shares many common attributes/processes with the film making process.
This post constitutes the first in a series. My plan is to take this list and create a number of other blog posts where we can all discuss how our industry could benefit from studying an industry that more closely parallels our own. I hope to answer the following questions:

- How does establishing a new metaphor help us as software developers?
- How can we change our thought process to use the new metaphor?
- Where does this metaphor fall apart?
- How can we optimize our development process by changing our metaphor?
- What tools exist to make the metaphor work?
- And any questions that I am sure you will come up with.
Building a Rails Server
Fellow Los Techie John Petersen took the famous (or infamous) Nerd Dinner application created by Scott Hanselman and ported it to the Ruby on Rails platform. When John was developing this, I recommended we actually put it online, and I proceeded to purchase the domain name www.railsdinner.com.

Owning the domain name was one thing; hosting the site was entirely another. Last week I decided to find a place to host Rails Dinner. That place, conveniently, is the server closet in my home office.
I had a spare machine not really doing much so I decided to appropriate it as a Linux server capable of hosting Ruby on Rails applications.
The Tools
Hosting a Rails application requires a number of moving parts. The moving parts I installed and their purpose are as follows:
Software | Purpose
--- | ---
Ubuntu 9.10 Server | Linux server
Ruby (including Gems) | The Ruby runtime
Ruby on Rails | MVC-based web application framework
MySQL | Open source SQL database
Passenger | Runtime deployment tool for Rails applications
nginx | High-performance web server
The Steps
Around the third or fourth attempt at installing this server, I decided to keep a log. The following tasks are what you need to do to get a Rails server up and running.
Task | Purpose
--- | ---
Download and install Ubuntu Server | www.ubuntu.com
sudo apt-get install ubuntu-desktop | Optional step to add GUI support to your Linux server
sudo apt-get install ruby-full build-essential | Install Ruby and all necessary libraries
sudo apt-get install rubygems | Install the RubyGems distribution tools
sudo gem install rails | Install Rails
sudo apt-get install mysql-server | Install MySQL Server
sudo apt-get install mysql-client libmysql-ruby libmysqlclient15-dev | Install all libraries used to talk to MySQL
sudo gem install passenger | Download all libraries needed to install Passenger
sudo apt-get install libopenssl-ruby | Install library required to compile the nginx web server
sudo apt-get install libssl-dev | Install library required to compile the nginx web server
sudo apt-get install zlib1g-dev | Install library required to compile the nginx web server
cd /var/lib/gems/1.8/bin | Change to the directory where the nginx build files are located. NOTE: You may want to add this folder to your system's PATH settings.
sudo ./passenger-install-nginx-module | Downloads code and compiles (yes, compiles!) a custom web server with Passenger built in. NOTE: This command is finicky and took me a few tries to get it to run properly.

nginx is a high-performance web server that is used by a lot of major web sites. It does not have a module/plug-in architecture, so Passenger is compiled directly into the server code.

Web Server Configuration

After your web server compiles, you need to add a script to your system to facilitate stopping/starting/restarting the nginx web server. I found the script (and instructions) below at the following site: http://library.linode.com/development/frameworks/ruby/ruby-on-rails/nginx-ubuntu-9.10-karmic

#! /bin/sh
### BEGIN INIT INFO
# Provides: nginx
# Required-Start: $all
# Required-Stop: $all
# Default-Start: 2 3 4 5
# Default-Stop: 0 1 6
# Short-Description: starts the nginx web server
# Description: starts nginx using start-stop-daemon
### END INIT INFO
PATH=/opt/nginx/sbin:/sbin:/bin:/usr/sbin:/usr/bin
DAEMON=/opt/nginx/sbin/nginx
NAME=nginx
DESC=nginx
test -x $DAEMON || exit 0
# Include nginx defaults if available
if [ -f /etc/default/nginx ] ; then
. /etc/default/nginx
fi
set -e
case "$1" in
start)
echo -n "Starting $DESC: "
start-stop-daemon --start --quiet --pidfile /opt/nginx/logs/$NAME.pid
--exec $DAEMON -- $DAEMON_OPTS
echo "$NAME."
;;
stop)
echo -n "Stopping $DESC: "
start-stop-daemon --stop --quiet --pidfile /opt/nginx/logs/$NAME.pid
--exec $DAEMON
echo "$NAME."
;;
restart|force-reload)
echo -n "Restarting $DESC: "
start-stop-daemon --stop --quiet --pidfile
/opt/nginx/logs/$NAME.pid --exec $DAEMON
sleep 1
start-stop-daemon --start --quiet --pidfile
/opt/nginx/logs/$NAME.pid --exec $DAEMON -- $DAEMON_OPTS
echo "$NAME."
;;
reload)
echo -n "Reloading $DESC configuration: "
start-stop-daemon --stop --signal HUP --quiet --pidfile /opt/nginx/logs/$NAME.pid
--exec $DAEMON
echo "$NAME."
;;
*)
N=/etc/init.d/$NAME
echo "Usage: $N {start|stop|restart|reload|force-reload}" >;&2
exit 1
;;
esac
exit 0
Copy this script to a file called "nginx" in your /etc/init.d folder. Then run the following commands to make the script executable and to start your nginx web server when Ubuntu launches:
chmod +x /etc/init.d/nginx
/usr/sbin/update-rc.d -f nginx defaults

Now you can start your server with the following command:

/etc/init.d/nginx start

The last step is to drop your code into a folder on your server and configure nginx to use it. When you create the folder where you plan on installing your Rails application, make sure to use the sudo command. The sudo command ensures that nginx and Passenger can run your code. The following two lines create a directory for your application code:
cd /home
sudo mkdir www.railsdinner.com

Now copy your Rails source code into this folder. Finally, you need to add a configuration entry to your nginx configuration file (/opt/nginx/conf/nginx.conf). Add the following code inside the http {} block in your .conf file:

server {
listen 80;
server_name railsdinner.com www.railsdinner.com;
root /home/www.railsdinner.com/public; # <--- be sure to point to 'public'!
passenger_enabled on;
}
Finally, Rails Dinner!

Now simply restart your nginx server and you are off to the races:

/etc/init.d/nginx restart

Go check it out for yourself: www.railsdinner.com

Thanks!

Thanks to John Petersen for creating this application. It was a great learning experience, and it took his work to inspire this endeavor.