Computer science emphasizes two important topics: datastructures and algorithms. These topics are important because the choices you make for a program's datastructures and algorithms affect that program's memory usage (for datastructures) and CPU time (for the algorithms that interact with those datastructures). When choosing a datastructure or algorithm, you sometimes discover an inverse relationship between memory usage and CPU time: the less memory a datastructure uses, the more CPU time associated algorithms need to process the datastructure's data items, which are primitive-type values or objects (accessed via references); conversely, the more memory a datastructure uses, the less CPU time associated algorithms need to process those data items, and faster algorithms result. This inverse relationship appears in Figure 1.

Figure 1. The inverse relationship between memory usage and CPU time
An example of the inverse relationship between memory usage and CPU time involves the one-dimensional array and doubly linked list datastructures, and their insertion/deletion algorithms. For any given list of data items, a one-dimensional array occupies less memory than a doubly linked list: a doubly linked list must associate links with its data items so that each data item's predecessor and successor can be found, and those links require extra memory. On the other hand, a one-dimensional array's insertion/deletion algorithms are slower than a doubly linked list's equivalent algorithms: inserting a data item into or deleting a data item from a one-dimensional array requires moving data items to expose an empty element for insertion or to close an element made empty by deletion. (I explore one-dimensional arrays later in this article, and doubly linked lists in next month's article.)
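To make the array side of this tradeoff concrete, the following sketch shows the data item movement that array insertion requires: every element from the insertion point onward must shift one position right to open a hole. The class name ArrayInsertDemo and the insert() helper are illustrative, not part of any standard API.

```java
import java.util.Arrays;

public class ArrayInsertDemo {
    // Insert value at index into a one-dimensional array whose first
    // 'used' elements hold data items. Assumes the array has spare
    // capacity (used < a.length).
    static void insert(int[] a, int used, int index, int value) {
        // Shift a[index..used-1] one slot right to expose an empty element.
        // This movement is what makes array insertion slow for large arrays.
        System.arraycopy(a, index, a, index + 1, used - index);
        a[index] = value;
    }

    public static void main(String[] args) {
        int[] a = new int[5];
        a[0] = 10; a[1] = 20; a[2] = 40;        // 3 data items, capacity 5
        insert(a, 3, 2, 30);                    // shift 40 right, store 30
        System.out.println(Arrays.toString(a)); // [10, 20, 30, 40, 0]
    }
}
```

A doubly linked list avoids this shifting entirely: insertion only rewires a few predecessor/successor links, which is why its insertion algorithm is faster despite the list's higher memory cost.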
This article initiates a two-part series that explores datastructures and algorithms. The article begins with a presentation of basic concepts and continues with a tour of the array datastructure. You learn about one-dimensional, two-dimensional, and ragged arrays, plus linear-search, bubble-sort, binary-search, and matrix-multiplication array-oriented algorithms. The article ends by asserting that Java's arrays are objects.
Note: You can download the source code that accompanies this article from Resources.
Before we explore specific datastructures and algorithms, we need to examine three basic questions: What is a datastructure? What is an algorithm? How do you represent an algorithm? Knowledge of those concepts helps you understand this series.
Datastructures have been around since the structured programming era. A definition from that era: a datastructure is a set of types, a designated type from that type set, a set of functions, and a set of axioms. That definition implies that a datastructure is a type with implementation. In our object-oriented programming era, type with implementation means class.
The "a datastructure is a class" definition is too broad, because it embraces Employee, Vehicle, Account, and many other real-world entity-specific classes as datastructures. Although those classes structure various data items, they do so to describe real-world entities (in the form of objects) rather than to describe container objects for other entity (and possibly container) objects. This containment idea leads to a more appropriate definition: a datastructure is a container class that provides storage for data items, and capabilities for storing and retrieving data items. Examples of container datastructures include arrays, linked lists, stacks, and queues. (I will explore the linked-list, stack, and queue datastructures next month.)
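A minimal sketch of a container class in this sense appears below: it stores data items and offers capabilities for storing and retrieving them, and nothing entity-specific. The class name FixedContainer and its store()/retrieve() methods are illustrative inventions, not drawn from any library.

```java
// A container datastructure in the sense defined above: storage for
// data items plus store/retrieve capabilities. Backed by a
// one-dimensional array of fixed capacity.
public class FixedContainer<E> {
    private final Object[] items;
    private int size;

    public FixedContainer(int capacity) {
        items = new Object[capacity];
    }

    // Store a data item in the next free element.
    public void store(E item) {
        if (size == items.length)
            throw new IllegalStateException("container full");
        items[size++] = item;
    }

    // Retrieve the data item stored at the given position.
    @SuppressWarnings("unchecked")
    public E retrieve(int index) {
        if (index < 0 || index >= size)
            throw new IndexOutOfBoundsException(Integer.toString(index));
        return (E) items[index];
    }

    public int size() { return size; }
}
```

Unlike Employee or Account, this class says nothing about any real-world entity; its only job is to contain other objects, which is what qualifies it as a datastructure under the containment definition.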