The Smart Approach to Maximizing Digital Resources
There is an ongoing discussion in the tech community about which language scales best, which server can bear the most load, and how best to tune your database and maximize the performance of your code. Here at Cuker Interactive, we pay close attention to all of these things, but many of these choices are secondary to the most powerful tech asset: common sense. Today we are going to share a bit of the common sense we have applied to the performance of our content management systems and web applications.
As veterans of the web, we are quite familiar with the boon and curse of a sudden spike in traffic. On one hand, the marketing department cheers; for them, it is a job well done. The tech department, on the other hand, is usually the one to groan, for they know that if the spike is of sufficient magnitude, the cheers will soon turn sour.
In the software world, everything requires tradeoffs, and this naturally applies to hosting capacity. If you want the ability to take millions of hits a minute, you will have to shell out quite a bit. The trick is to accurately anticipate a significant load spike and to plan accordingly.
This is where the common sense comes in. Conventional wisdom held that to handle more traffic, you needed more power. While this still holds true, it is far more important, and more cost-effective, to analyze the amount of work your server has to do to successfully serve a page request. Here are some simple tricks that we use to keep the power required per request to a minimum, and thereby keep our clients communicating their message and selling their products at the most critical moment: a surge of traffic.
First up, caching. For the vast majority of dynamic, database-driven websites out there, there is no need to re-generate most pages on every request. You can have the flexibility and dynamism of a Content Management System with the speed and power of a basic HTML site if you use a caching system to store pre-generated pages at regular intervals. That way, the work to generate the page is only done once every, say, 5 seconds. If that page is being hit 5,000 times a second, then out of every 25,000 requests you render the page just once and serve the other 24,999 at a fraction of the cost (something on the order of thousands of times cheaper).
Second, keep an eye on your application's database usage. Most web applications spend the vast majority of each request's time in the database. Watch for anything that could be issuing too many database queries, especially expensive operations like sorts and groupings. If you can optimize your application to run fewer queries, you gain a ton.
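As a small illustration of cutting down query counts, the sketch below contrasts the classic one-query-per-row pattern with a single batched query, using an in-memory SQLite table with made-up names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE authors (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO authors VALUES (?, ?)",
                 [(1, "Ann"), (2, "Ben"), (3, "Cat")])

author_ids = [1, 2, 3]

# Slow pattern: one query per id (N round trips to the database).
names_slow = [conn.execute("SELECT name FROM authors WHERE id = ?",
                           (i,)).fetchone()[0] for i in author_ids]

# Faster pattern: a single query fetches every row at once.
placeholders = ",".join("?" * len(author_ids))
rows = conn.execute(
    "SELECT id, name FROM authors WHERE id IN (%s)" % placeholders,
    author_ids).fetchall()
names_fast = [name for _id, name in sorted(rows)]

assert names_slow == names_fast  # same data, far fewer queries
```

The batched version returns identical data while making one round trip instead of N, which is exactly the kind of saving that compounds under a traffic spike.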
Third, utilize a Content Delivery Network. When thousands of visitors arrive every second to view a video, the last thing you want is for all of them to stay connected for thousands of seconds to download that video from your application server. Better to let them show up, get the page HTML, and then download the video from somewhere optimized to deliver it. This keeps your application server fast and light, and keeps movies playing smoothly. Our Content Management System of choice, Web Cube, automatically pushes videos to a CDN in order to ensure that bandwidth costs are kept down and the site is kept up.
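One common way to apply this split, sketched below with a made-up `cdn.example.com` hostname (this is an illustration, not Web Cube's actual mechanism), is to rewrite media URLs so the page HTML comes from the application server while the heavy files come from the CDN:

```python
# Hypothetical CDN hostname for the example; in practice this would be
# your CDN provider's domain or a custom CNAME.
CDN_HOST = "https://cdn.example.com"

def media_url(path):
    """Return a CDN URL for a media file instead of serving it locally."""
    return "%s/%s" % (CDN_HOST, path.lstrip("/"))

# The app server sends a few kilobytes of HTML; the CDN streams the video.
html = '<video src="%s"></video>' % media_url("/videos/launch.mp4")
```

The application server's work per request stays tiny and constant, while the bandwidth-heavy transfer is handled by infrastructure built for exactly that job.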
Finally, try to do things when your content changes, not when it is viewed. In the world of Content Management Systems (as opposed to other web applications), your data often changes quite slowly, on a minute-by-minute scale, while it is viewed second by second. Thus, it makes sense to do a little prep-work each time your content changes, rather than repeating that work for every one of the far more frequent views.
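A tiny sketch of this write-time versus read-time tradeoff, with illustrative function names rather than any real CMS API:

```python
# Pre-rendered pages, keyed by slug. Populated on save, read on view.
rendered_pages = {}

def render(article):
    # The "expensive" work: turning stored content into HTML.
    return "<h1>%s</h1><p>%s</p>" % (article["title"], article["body"])

def save_article(article):
    # Editors save a few times a minute at most, so render here...
    rendered_pages[article["slug"]] = render(article)

def view_article(slug):
    # ...while readers, who may arrive thousands of times a second,
    # get a cheap dictionary lookup instead of a render.
    return rendered_pages[slug]

save_article({"slug": "news", "title": "Launch", "body": "We shipped."})
```

Because saves are rare and views are constant, moving the rendering cost to the save keeps the per-view cost nearly free.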
All of these strategies are best practices, and we follow them enthusiastically here at Cuker Interactive. A fundamental component of great design is the speed with which it is delivered. The central mission of the Technology Department here at Cuker Interactive is to maximize speed, reliability, security, and flexibility, so that our brilliant designers and astute marketers can deliver you the results you deserve.