The entire studio Ethernet ran on a standard 100Mbps line, 100BASE-T Fast Ethernet, which used to be enough. However, with the addition of a render farm and fast centralized storage, I needed an upgrade. Thankfully, nowadays Gigabit Ethernet has become pretty mainstream, so I didn’t really have to put too much money into the whole network. All the standard mainstream motherboards come with integrated 10/100/1000Mbps network cards, and 1000BASE-T switches and routers are also pretty cheap, so all I really needed was a new switch, a bunch of CAT6 cables (although CAT5e would have been enough as well) and a bit of re-wiring. The new network topology can be seen at the top.
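To put the upgrade in perspective, here’s a back-of-the-envelope sketch of the transfer-time difference between the two links. The 10 GB sequence size and the 90% efficiency factor are my own illustrative assumptions, not measured numbers; real throughput varies with protocol overhead and disk speed.

```python
def transfer_seconds(size_gb, link_mbps, efficiency=0.9):
    """Rough seconds to move `size_gb` (decimal GB) over a `link_mbps` link.

    `efficiency` is an assumed fraction of nominal bandwidth actually usable.
    """
    bits = size_gb * 8 * 1000**3            # decimal gigabytes -> bits
    return bits / (link_mbps * 1e6 * efficiency)

# Moving a hypothetical 10 GB frame sequence:
for name, mbps in [("100BASE-T", 100), ("1000BASE-T", 1000)]:
    print(f"{name}: ~{transfer_seconds(10, mbps):.0f} s")
```

Roughly fifteen minutes versus a minute and a half for the same batch of frames, which is what makes the farm-plus-storage setup workable at all.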
Hell yeah! After a few months of putting the gear together, getting all the paperwork done, and installing the electrical and network cabling, we are finally running our own data management platform with the added bonus of a private render farm. How cool is that?!
I finally started building the render slaves for my studio. The first dedicated render node I built is based on basic mainstream parts, nothing fancy, but with enough power that it makes sense to place it in a rack installation.
The basic idea, obviously, was to build as powerful a machine as possible for the lowest possible price. Since I’ve been an Intel user since, well, forever, I based the machine on a Core i7 860 (Lynnfield) CPU and DDR3 memory; the rest is pretty much optional. But for my purposes, I want every machine in the studio to follow the idea of having one dedicated hard drive, preferably a pretty fast one, for the OS and another dedicated one for all the offline data. So each machine, including the render nodes, will host a C: drive with all the software and programs and a D: drive that’ll be set up to hold all the files we’ll work with. The workstations will optionally have other HDDs, but these two drives are necessary in order to rule out variables in the pipeline I’ve been building for a few months now.
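The two-drive convention above can be expressed as a tiny conformance check. This is a minimal sketch of my own, not part of any actual studio tooling; the function name and the drive-role mapping are illustrative assumptions.

```python
# Assumed convention from the text: every machine must expose a C: drive
# (OS + software) and a D: drive (offline/working data).
REQUIRED_DRIVES = {
    "C": "os",    # OS and installed programs
    "D": "data",  # working files for the pipeline
}

def check_drive_layout(present_drives):
    """Return the sorted list of required drive letters missing from a node.

    `present_drives` is an iterable of drive letters found on the machine,
    e.g. ["C:", "D:", "E:"]. An empty result means the node conforms.
    """
    found = {d.upper().rstrip(":") for d in present_drives}
    return sorted(letter for letter in REQUIRED_DRIVES if letter not in found)

print(check_drive_layout(["C:", "D:"]))  # conforming node -> []
print(check_drive_layout(["C:"]))        # missing its data drive -> ['D']
```

A check like this could run when a node registers with the farm, so a machine with a nonstandard layout never silently introduces the pipeline variables the convention is meant to rule out.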
The T-Systems guys were pretty quick! I didn’t expect them to show up this week, but they did. Kudos!
Anyway, the server room is finally plugged into the local central switch. It is not online yet; T-Systems will have to go through yet another bureaucratic procedure prior to setting up the line. But the hard work is done, and all I need now is an electrical inspection and the T-Systems green light.
We finally got to the first stage of installing and prepping the server room this weekend: getting electricity to the computers. The two problems here were the power draw and the network connection. Unfortunately, I didn’t have any choice but to place the server with the DAS and the render slaves in a storage room on the ground floor of the building my studio occupies (a former flat). The room is great since the server and the running machines don’t bother anybody, but it’s not properly air-conditioned, it had no power outlets (except for the lighting), and it wasn’t connected, or even remotely able to be connected, to the LAN switch.
It is sad, but the promising concept of Containers introduced in 3ds Max 2010 lacks fundamental functionality. It’s missing so much that I can hardly see a benefit in using them instead of XRefs.
duber studio and its projects (duber.tv, this blog, mycirneco.com, chargethedragon.com and various other hosted services) have today successfully migrated all their databases to our own server, a Dell PowerEdge. We’ve been running our own server since July 2007; however, most of our databases were still kept over at our hosting partner, Hokosoft s.r.o., which still manages our server housing and maintenance. This move means slightly faster database transfers (most likely not noticeable to visitors of our sites), but in terms of administration and management it’s a very welcome move, as we can now start to integrate our VFX pipeline and management tools on a remote server for faster and smoother collaboration.
There are a few software packages I stumbled upon or have been using for ages which are, at least to me, of much higher value than what they’re being sold for, which is always a great thing! Here are four I picked from the many programs I use daily in my work, or even for fun and personal enjoyment. It is rare in the high-end DCC world for an app to be worth more than its price tag. Don’t get me wrong, they’re all worth their fee; however, with at least some of them I feel the app delivers more, in a better way or faster or what have you, than it’s supposed to or than the developer markets. Here’s my pick for today’s encomium: