I’ve been working with Chef from Opscode lately. One of the awesome features is the ability to create templates. A file template uses ERB for any customization. But what if you are creating a template of an ASP page? ERB and ASP use very similar code tags. The following example will cause an issue and Chef will fail:

<%hdr="<%= @site_name %>"%>

My recipe is only passing in a variable of site_name. Chef has no clue how to interpret anything else, and the ERB template creation fails. There’s a way around this:

<%%hdr="<%= @site_name %>"%>

Add the extra “%” so that “<%%” escapes the ASP code tag. It does exactly that, escapes the <%, but what one needs to know is that this does not produce the expected results. The entire line up to the closing tag is treated as literal text, so the embedded ERB tag is never evaluated. The result after template creation is:

<%hdr="<%= @site_name %>"%>

To correct this, one would hope that simply adding a closing tag after the escaped portion would do the trick, like so:

<%%hdr="%><%= @site_name %>"%>

After which, the template will be processed… kinda:

<%hdr="%>The Botched Network" %>

NOT what I want it to do. After mucking around with this for a long time, I’m at a loss. I ended up storing the entire line that I needed in a variable and passing it in from the recipe; a rough sketch is below. I relinquish my ERB learning for now…
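
For the record, that workaround looked roughly like the following. The attribute and file names here are made up for illustration; the idea is just to build the problematic ASP line in the recipe and hand it to the template as an ordinary string:

# recipe -- build the ASP line in Ruby so the template never has to escape it
asp_header = '<%hdr="' + node['my_site']['site_name'] + '"%>'

template 'C:/inetpub/wwwroot/header.asp' do
  source 'header.asp.erb'
  variables(:asp_header => asp_header)
end

In header.asp.erb the troublesome line then simply becomes:

<%= @asp_header %>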

Sources:

y u no worky

I’ve recently been tasked with providing reports on a regular basis that show the status of our servers and services that we monitor using Nagios. We’ve devoted a rather large amount of time to providing Nagios with many checks, graphing as much data as possible, and making custom plugins wherever we see fit. Nagios has been fantastic thus far with its flexibility and ease of customization. Though daunting for the newcomer, Nagios has proven to be a very valid candidate for enterprise-worthy monitoring. That is, until I discovered an interesting problem that I cannot find a resolution to.

Let’s roll through this by looking at a standard host check:

[Screenshot: host_check_current]

As you can see from this screenshot, the host is clearly up. In fact, the last status change was 10:17AM on the 28th (a planned server reboot). From this we can conclude that as of right now the server is up with a status of OK. So let’s run an availability report.

[Screenshot: host_availability_report]

As you can see from this report, the server appears to have gone down on the 25th. There are no entries showing the server going down on the 28th due to the reboot, and no entries showing it coming back up on the 28th either.

With this information we can derive two things. One, something is wrong with Nagios somewhere. Whether it is our specific configuration or Nagios lagging and missing some data, somewhere it failed to log the fact that the server cycled a couple of times. Two, there is a disconnect between how Nagios stores the data showing that a server is online and how Nagios pulls the data to provide the reporting. I would expect the second to be true regardless, given how this data is captured. The problem I have is that despite the fact that the server is OK RIGHT NOW, Nagios’s availability report claims otherwise. This is causing a rather large negative impact on our reports.

Note that we do have a parenting structure in place, so we decided to create a mock setup with another server that has one parent. We deliberately brought down the mock parent, followed by the mock server, to fully recreate what happens during such an outage. The systems were brought back up at the same time. Nagios reported all systems coming online just fine, and the availability report for that test looked as it should. But applying the same approach to other hosts didn’t fix the issue.

In our case we stopped the service, removed the historical data that the reports pull from, and restarted Nagios. We kept our graphs, but Nagios simply no longer has the capability to report on that period of time. A bummer, but I want my reports to be correct for the time being. I’ll be looking into this deeper in the future.
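
For anyone curious, the reset amounted to something along these lines. The paths are assumptions based on a source install under /usr/local/nagios; package installs keep these files elsewhere, so treat this as a sketch rather than a recipe:

service nagios stop
mkdir -p /root/nagios-log-backup
mv /usr/local/nagios/var/archives/* /root/nagios-log-backup/    # rotated logs that feed the availability reports
> /usr/local/nagios/var/nagios.log                              # truncate the current log as well
service nagios start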

Yes.

I use Cygwin.

My daily job entails managing both Windows and Linux machines on a regular basis. The environment that I work in is mostly Windows machines; however, a small number of our systems do run Linux. For basic administration, you’d think PuTTY is the way to go: simply log in and run with it. But from a standpoint of security, and to not interfere with the performance of a production system, you really ought not log into any production server. As an avid Linux user, I have a decent sense of awareness as to when doing X might affect Y. But I’m not the greatest of typists. You never know.

So I’ve decided to do basic Linux tasks from my own personal machine. Cygwin may be one of those insanely convoluted ways of running Linux commands and tasks on a Windows machine, but there’s one massive benefit that makes me like it so much: I can mess with files on a Windows machine with ease. Plus, I can use PuTTY to log into the SSH server on my own machine. You don’t realize how much hatred comes with the use of the Windows command line. Granted, I know nothing about PowerShell. Go figure…

So then at this rate, why not create a virtual machine and run it inside of Windows? Performance. Why not use Linux as the main system and run a Windows virtual machine? Performance. I know that may sound silly, but as a power user, there are way too many tasks and applications that I need to manage quickly, and having the admin kit come up very quickly for any service helps me out. I’m not a fan at all of running VMs on my business machine simply because I am a power user and I will starve my machine of resources way too quickly. Now if they would simply upgrade my RAM… perhaps.

I also don’t know of a solution that offers the ability to run a Linux box as a virtual machine while still being able to manage the files on the physical box. VMware Workstation might, but it’s hefty, requires a license, and running its services is rather taxing on my puny little workstation. As far as I know, QEMU and VMware Player do not have the capability to share files between guest and host. The only solution that I do know of is Parallels, but I don’t run an Apple at the office.

Cygwin I utilize lightly, and it has the minimal features that I need to do what I need to do for my environment. I can log in from a server to my machine to pick up files or check something out. I can scp something from my machine to a server. And as another example, I could run wget on a website from a production or another Linux server… But I run across many websites where the download link is a JavaScript or PHP redirect. Download it to my desktop, use Cygwin to scp it on over (see the example below). Convoluted? Yes, I agree. Though this makes my life a bit easier. Since I’m also a Perl user, this lets me test scripts a bit more easily too. With Cygwin you have full access to the Windows environment. Any command that works in Windows, Cygwin will pull off without much effort. If I installed a Ruby environment on my system, Cygwin would be able to rock it out.
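
The scp dance from the Cygwin side looks something like this; the path and hostname are just placeholders:

scp /cygdrive/c/Users/me/Desktop/some-package.tar.gz admin@prod-web01:/tmp/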

If I had any complaint, it’d be the slashes. I use PsTools heavily, and it is actually one of those programs that simply doesn’t work out so hot due to the nature of the backslashes. By no means is anything perfect.

I do have my own test machine that runs CentOS; however, I do like having Cygwin around at all times. I’m not trying to sell Cygwin, I’m just letting you know that I like it. At least I’m not this bad.

Here is how to set up Cygwin’s SSH daemon to run as a service. Note that if you run a firewall, you may need to add an exception, particularly if you plan to connect from a different machine.
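
In short, from a Cygwin terminal run with administrative rights, it goes roughly like this. The prompts and commands vary a bit between Cygwin versions, and the firewall rule shown is the Vista/7 syntax:

ssh-host-config -y      # creates the sshd service and the accounts it runs under
net start sshd          # or: cygrunsrv -S sshd
netsh advfirewall firewall add rule name="Cygwin sshd" dir=in action=allow protocol=TCP localport=22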

So today was once again another beautiful day here in the city of Annapolis. Yesterday was bright, sunny, and warm, with a touch of heavy wind. Today, even better, with just a slight breeze. I chose the time to take myself out and explore one of the parks around this city. I cruised through Quiet Waters Park. A nice-sized park with plenty of parking in different areas and a couple of spots where the onion grass rustles in the wind. There are a couple of gazebos and shelters for huge get-togethers, as well as a nice kids’ playground for all the youngsters to gather in groups and challenge the parents’ ability to keep their children from having too much fun.

Something that sets this park apart is the fact that there are no sporting grounds at all. So it is more for families at heart that just want to get away from nearly everything, as well as the retired who don’t want to deal with the sweaty bodies of humans running around. During winter they have an ice skating rink.

I dropped by Potbelly’s to pick up a fantastic lunch. A cute girl works there; she did step two of the sandwich process. They have a really cool website. I then floated on over to Art Stuff Inc. They have a not-so-cool website, but are in a cool location. Apparently I’ve left all my drawing stuff at the crib back in VA. Bummer. I then drove around the outside of the park, checking out the neighbourhood. I already forget what the neighbourhood is called, but they have a private beach. And the shoreline is close in general, making a very nice view for some.

I then turned into the park, found my place to park, then parked my butt on a towel and delved into reading about LDAP; something I’ve been doing since I started my new job… I then whipped out my creative side and picked up some sun on half of both my arms while drawing out some random picture. It’s not finished. But when it is, I’ll throw it up here. Then I can exercise my ability to have graphics on this site! MMMMM the possibilities of having more stuff. Until next time. Have some fun in the sun.

TorrentFlux is possibly easier to install than a torrent client on any user’s computer. Seriously!

apt-get install torrentflux

Within 45 seconds TorrentFlux will have installed, configured itself, and restarted Apache. Simply browse to:

http://localhost/torrentflux

after the install and POOF. Finish the configuration. Unless something didn’t install properly, everything is well more than dandy. In my case I changed the default location in which all torrents are saved, due to the strangeness of the hard drive setup in my system. By default TorrentFlux is set up to download to the /var/some/directory. I have it download everything to the /home/something directory. The very first option on the admin page will allow you to choose a different directory. The only mandatory thing is to make sure that directory has 777 permissions:

chmod 777 /directory/name/goes/here

The default directory has already been set up like this.

TorrentFlux uses BitTornado by default, spawning a separate Python process for each torrent that is to be downloaded. That process stays up until the torrent is cleared from the queue, whether the torrent is downloading or seeding.
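
If you want to see those per-torrent processes for yourself, something like this will list them; the exact process name depends on which BitTornado client the package ships with:

ps aux | grep -i tornado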

Probably the only bummer of doing torrent downloads this way is that the files aren’t on the computer on which I actually use the media. However, it helps me out when I move files around, because all of my media stays on the server and is just streamed over the network whenever I want to view said stuff. YAY TORRENTS! Peace out peeps.