7 cutting-edge programming experiments worth trying

Get the best from trending technologies like Erlang, Node.js, and Go

The words cutting edge may be crisp and definite, slicing through air like a knife in a bar fight. But few things strike fear in the minds of enterprise IT like the claim that a new product is built by a team working on the "cutting edge" of technology.

The problem isn't aversion to the new or being too old to change. After all, enterprise teams need all the best new ideas that come along -- even before they're completely done. But they also need all the stability and certainty that the old faithful stacks of code bring.

The trick to making the most of the cutting edge is to experiment, not to jump in with both feet. Try your code where it can be most effective and see whether the cutting-edge tool offers the performance and features you need. Then check to see whether you're trading off anything essential. Experiments don't always reveal the hidden weaknesses or trade-offs immediately. So work in increments, and when your experiment is fully vetted, move to a more serious implementation.

Here are seven experiments with the "latest and greatest" worth undertaking today. None of these technologies was invented last week -- in fact, some are more than two decades old. But they're attracting significant attention on the leading edge for what they can offer the modern enterprise stack today. Try them out. They're good places to begin moving beyond the trusted worlds and trying something faster, simpler, or cleaner.

Cutting-edge experiment No. 1: Erlang for clean concurrency

The cutting edge exists largely to solve problems -- to scratch an itch, as they say. The cloud exists because managing a data center is a job full of headaches. New languages like Erlang came along because the old ones weren't up to the job.

More than 20 years old, Erlang is gaining new fans because it offers effective structures for minimizing the trouble of running concurrent threads. Web servers juggling multiple users concurrently are more likely to behave correctly if they're written in Erlang because the language is designed to help programmers make the right decisions by limiting the way their programs are written. A genius could do the same thing with any language, but Erlang builds a set of safety rails into its design to keep the threads from getting tangled. The language pushes functional design and message passing over shared variables that must be locked and unlocked. This makes creating an enterprise cluster that handles many users far simpler.
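Erlang expresses this discipline with lightweight processes that own their state and talk only through mailboxes. The flavor of the idea can be sketched in TypeScript -- purely for illustration; the Counter actor and message shape below are hypothetical, and Erlang's VM does far more, scheduling thousands of isolated processes preemptively:

```typescript
// A rough analogue of Erlang's share-nothing style: each "process" owns
// its state outright and reacts only to messages placed in its mailbox.
type Message = { from: string; amount: number };

class Counter {
  private total = 0;            // private state no other code can touch
  private mailbox: Message[] = [];
  private draining = false;

  send(msg: Message): void {    // the only way in is a message
    this.mailbox.push(msg);
    if (!this.draining) this.drain();
  }

  private drain(): void {
    this.draining = true;
    while (this.mailbox.length > 0) {
      const msg = this.mailbox.shift()!;
      this.total += msg.amount; // state changes happen in exactly one place
      console.log(`${msg.from} -> total is now ${this.total}`);
    }
    this.draining = false;
  }
}

const counter = new Counter();
counter.send({ from: "alice", amount: 3 });
counter.send({ from: "bob", amount: 4 });
```

No locks appear anywhere because no state is shared; that, in miniature, is the bargain Erlang offers.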

Erlang was developed by Ericsson for its internal telecom systems before being released as open source. The larger community that has arisen around Erlang since then offers installers for the basic runtime on most major operating systems, along with a large collection of open source projects. Many use OTP, the Open Telecom Platform, as a foundation for supplying a data service such as a website -- the starting point for most simple projects. (See erlang.org for downloads.)

There are limits, though, to what any language can do. Many of the problems that new tools are designed to fix aren't the result of ignorance or failure; they just represent different decisions. Think "trade-offs" instead of "problems."

The cloud, for instance, may offer simplicity and flexibility, but it provides this in exchange for control and security. Erlang programmers trade some of their freedom for this new model. If your code deals with many isolated users who don't need to interact directly, it's simple to write in the Erlang model. But if you need your threads to communicate -- and think you have the chops to make it work -- Erlang makes your life harder.

Start small and find out whether your proposed use of Erlang is on the right side of the trade-off or the wrong one.

Cutting-edge experiment No. 2: The Node.js Web stack

Many enterprise services are judged on how quickly they deliver data. No one wants to keep a potential customer hanging around watching some blank window on a browser. Even fewer people want to keep their bosses waiting for some crucial report or analysis of the business.

Some cutting-edge tools are designed for speed. Node.js, for instance, is popular because it runs very quickly. It can be even faster when paired with one of the newer NoSQL databases, which offer incredibly fast results when saving data. Together, they make it possible to build a fast Web infrastructure on a small platform that, as a side effect, also consumes much less electricity. Speed and energy efficiency are often linked.

The speed should be attractive to companies looking to put a priority on responding quickly. Some of the more ephemeral websites never want to disappoint or let a potential user wait very long. Companies with captive clients -- say, banks -- may want to make different decisions.

Node.js is an open source stack built on top of the Chrome V8 JavaScript engine, but most people will begin with the prebuilt executables that nodejs.org offers for all of the major platforms. Joyent, the major sponsor, also offers cloud machines with images that include all of the necessary libraries and tools.
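Getting a first server running takes only a few lines against the http module that ships with Node.js. A minimal sketch, written here in TypeScript (the port number is an arbitrary choice):

```typescript
// A minimal Node.js Web server using the built-in http module.
import * as http from "http";

const server = http.createServer((req, res) => {
  res.writeHead(200, { "Content-Type": "text/plain" });
  res.end("Hello from Node.js\n");
});

server.listen(8080, () => {
  console.log("Listening on http://localhost:8080/");
});
```

Every request that arrives is handled by that one callback in that one process -- a detail that matters later.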

Many developers head straight for Web frameworks like Tower, Geddy, or Railway, each of which simplifies the work of building a basic, data-driven website.

The trouble with Node.js is not with performance, but with the weight put on the programmer's shoulders. Programmers need to be smarter and more careful because the entire package runs in one process. If one user tosses a curve ball that hits a bug in your code, the entire Web server could lock up. Good programmers and extensive testing can avoid this, but no one is perfect all of the time. It is the polar opposite of Erlang because it offers few limits to keep programs from going awry.
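A contrived sketch makes the danger concrete. The URLs and timings below are invented, but both failure modes are real:

```typescript
// Everything runs in a single process: one unhandled error or one long
// synchronous loop stalls every connected user, not just the one at fault.
import * as http from "http";

http.createServer((req, res) => {
  if (req.url === "/boom") {
    // An exception thrown here is uncaught -- the whole server dies.
    throw new Error("one bad request");
  }
  if (req.url === "/slow") {
    // A synchronous busy-loop blocks the event loop for everyone.
    const end = Date.now() + 10_000;
    while (Date.now() < end) { /* spin for 10 seconds */ }
  }
  res.end("ok\n");
}).listen(8080);
```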

The Node.js-NoSQL combination exposes a driving force for today's cutting edge: a focus on supporting the explosion of interest in social networks. If you're thinking of experimenting, find a place where you can afford to be fast but not careful. If your data needs careful curating, you might want to avoid these dangers.

Cutting-edge experiment No. 3: HTML5 Web and mobile apps

A new broom sweeps clean, the old saying goes, and so do new tools. The latest languages and software stacks that are built from scratch are not larded up with multiple revisions and deprecated APIs. The syntax and format are simple and uncluttered.

This usually produces cleaner, simpler code. While programmers can write convoluted code in any language, the newer stacks often require less extra glue code and version testing. Some of my code for smartphone apps goes through dozens of version tests to make sure it's doing the right thing for the right version. New stacks don't have this extra complexity.

There are dozens of new HTML5 projects that handle many of the basic details of creating a website or a mobile phone app. The code, which is often called a framework or a scaffolding, organizes the content in pages and offers a transition mechanism ruled by menus. Some of the most popular are jQuery Mobile, Sencha Touch, and Titanium, but a number of other tools are emerging. Many of the most popular CMS stacks like WordPress or Drupal sport themes that are tuned to the mobile environment and often use some of the same code.

While these new code stacks are clean, they often achieve this by tossing aside old platforms. It's easy for new tools to let people write simple, elegant code. They just ignore the older hardware and the older versions of the operating systems. Voilà! Of course they're simpler and faster because they only work with the pre-release code shipping at this moment.

The glitches with the HTML5 frameworks start appearing if you use an older browser or one that's not as standards-compliant. Suddenly, the menus appear in weird places and half of the text is misplaced because the CSS instructions don't work. Sometimes the new needs to get along with the old, and it's a problem when the new code insists it can only solve things one way.
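One defensive habit is to test for the feature itself instead of assuming a modern browser. A minimal sketch of the idea -- the two render functions here are hypothetical placeholders:

```typescript
// Feature detection beats version sniffing: probe for the capability,
// then degrade gracefully when it's missing.
function renderFlexLayout(): void { document.body.className = "layout-flex"; }
function renderFloatFallback(): void { document.body.className = "layout-float"; }

const probe = document.createElement("div");
if ("flex" in probe.style) {
  renderFlexLayout();      // the browser understands flexbox
} else {
  renderFloatFallback();   // older engine: fall back to a float-based layout
}
```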

Before you launch an experiment in this area, know where you can afford to support only a subset of the technologies out there.

Cutting-edge experiment No. 4: Chewing up data with R

From cleaner Web design to more sophisticated analysis of big data, the R language lies at the core of some of the most popular new tools designed to use math to solve problems and take care of customers. The collection of tools around R is more than just a language with predefined functions for common statistical formulas; it represents an entirely new way of thinking about the problem and finding a solution.

The statistical models inside big data analysis packages, for instance, can suss out and flag complex patterns and take advantage of all the power a modern cluster of computers can deliver. They replace the old mechanisms that would simply sort or look for maxima. Working with cutting-edge statistical software means you can do deeper analysis and find signals when the old code just saw noise.
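In R itself, a statistic like correlation is a one-liner, cor(x, y). Spelled out in TypeScript with invented toy data, here is roughly what that one-liner computes:

```typescript
// Pearson correlation: how strongly two series move together (-1 to 1).
function pearson(x: number[], y: number[]): number {
  const mean = (v: number[]) => v.reduce((sum, a) => sum + a, 0) / v.length;
  const mx = mean(x), my = mean(y);
  let num = 0, dx2 = 0, dy2 = 0;
  for (let i = 0; i < x.length; i++) {
    const dx = x[i] - mx, dy = y[i] - my;
    num += dx * dy;    // covariance term
    dx2 += dx * dx;    // spread of x
    dy2 += dy * dy;    // spread of y
  }
  return num / Math.sqrt(dx2 * dy2);
}

// Toy data, made up for this sketch: shelf position versus units sold.
console.log(pearson([1, 2, 3, 4, 5], [2, 4, 5, 4, 6])); // ~0.85
```

The point of R is that this function, and thousands of far subtler ones, are already written, tested, and composable.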

When these new insights appear, they can save businesses billions of dollars. They help stores detect local tastes and ensure that the shelves are better stocked with the colors, patterns, and sizes demanded by the people in the neighborhood. They offer marketing engineers the opportunity to do a better job of guessing how much advertising is enough. Anywhere there's data, there's a chance to find significant insights.

R, the language, is distributed through an open source project devoted to nurturing the core. Many developers start with a more complete IDE like RStudio, which bundles editors and output windows together with the execution engine. The IDE is the best way to create code that can later run on just the core when it's deployed into production stacks.

The trouble with statistical tools like R is that the insights don't always come, and what comes of the experimentation isn't always significant. Just because the thinking is newer doesn't make it better. Big data offers perfectly good theories and even great ideas, but few know just how good they are -- especially in context. Will this kind of statistical analysis really help your product? Will the incoming data have enough precision to allow the theory to work? No one knows, but you might find out if you devote several months to experimenting.

Consider the excitement about using statistical tools like R to slice through the mounds of data piling up in your disk farms. Perhaps you're the lucky one who has data filled with one very strong signal just waiting to be discovered. Most folks find that data mining requires plenty of human intelligence to discover the crucial insights that are buried in the noise. A quick dive into the numbers just yields confusion.

Cutting-edge experiment No. 5: Tapping the speed of NoSQL

Let's face it: We programmers are a lazy bunch. We won't start building something from scratch unless we need to. New tools are usually built around one big new feature -- sometimes several.

The only way to get these features is to embrace these new tools. Many of the new NoSQL databases slip effortlessly into the cloud: They see a rack of machines and spread the work across all of them. That's why they were built, and it's what they do well. They wouldn't exist if they weren't needed.

There is a wide collection of NoSQL projects offering slightly different sets of features, and enumerating them and explaining the differences among them is beyond the scope of this article. A few of the more popular tools are Cassandra, MongoDB, CouchDB, and Riak. Some companies also offer the tools as services: MongoLab and MongoHQ are two that will store data using MongoDB, and similar services are available for the others.
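A short sketch shows how little ceremony a document store demands. This one assumes the official MongoDB driver for Node.js (npm install mongodb) and a server running locally; the database and collection names are invented:

```typescript
// Basic document-store usage with the official MongoDB Node.js driver.
import { MongoClient } from "mongodb";

async function main(): Promise<void> {
  const client = new MongoClient("mongodb://localhost:27017");
  await client.connect();
  const users = client.db("demo").collection("users");

  // No schema declared up front: documents are just JSON-like objects.
  await users.insertOne({ name: "alice", friends: ["bob", "carol"] });
  console.log(await users.findOne({ name: "alice" }));

  await client.close();
}

main().catch(console.error);
```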

The ability to respond like lightning and to scale almost as quickly may be worth rewriting all of your code for, but one of the reasons these seductions of the cutting edge seem so great is that we haven't yet felt how they can go wrong. There's usually a dark side, and it often takes a bit of time to discover it -- often by mistake.

The same issues confront NoSQL databases. They're fast, but mainly because they don't offer any iron-clad promises of consistency. They suck up the data and respond with an "All Clear" before they're sure that the data has been written to disk. This may be adequate for many of the websites that traffic in social gossip where a lost status update means little, but it's not ideal for others.
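MongoDB, for one, makes that speed-versus-safety dial explicit through its write concern. Continuing the hypothetical users collection from the sketch above:

```typescript
// w: 0 is fire-and-forget -- the driver returns without waiting for any
// acknowledgment that the write actually landed anywhere.
await users.insertOne({ status: "brb" }, { writeConcern: { w: 0 } });

// j: true waits until the write is committed to the on-disk journal --
// slower, but now the "All Clear" means something.
await users.insertOne({ balance: 5000 }, { writeConcern: { w: 1, j: true } });
```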

Find a spot where you can afford to play without reservations and begin to tinker with a few of these key-value datastores.

Cutting-edge experiment No. 6: Finding connections with graph databases

The idea of a database was well defined in the last century. You define a table with a list of columns that hold particular data, then insert rows into the database until it's full. The columns might hold integers, decimal numbers, or strings, but that's about all of the flexibility you get.

Graph databases like Neo4j are a new twist on the idea. You still stick your numbers and letters in columns, but now you can create pointers between the rows that form networks. If you're storing a social network, the database is ready to record who is friends with whom.
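A sketch using the official neo4j-driver package for Node.js shows the difference. The connection details, credentials, and names below are placeholders:

```typescript
// Storing and querying a friendship graph: relationships are first-class,
// so "friends of friends" needs no join tables.
import neo4j from "neo4j-driver";

async function friendsOfFriends(): Promise<void> {
  const driver = neo4j.driver("bolt://localhost:7687",
                              neo4j.auth.basic("neo4j", "password"));
  const session = driver.session();
  try {
    // Record that alice is friends with bob.
    await session.run(
      `MERGE (a:Person {name: $a}) MERGE (b:Person {name: $b})
       MERGE (a)-[:FRIEND]->(b)`,
      { a: "alice", b: "bob" }
    );
    // Ask the graph: who are the friends of alice's friends?
    const result = await session.run(
      `MATCH (:Person {name: $a})-[:FRIEND]->()-[:FRIEND]->(fof)
       RETURN DISTINCT fof.name AS name`,
      { a: "alice" }
    );
    result.records.forEach(r => console.log(r.get("name")));
  } finally {
    await session.close();
    await driver.close();
  }
}

friendsOfFriends().catch(console.error);
```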
