While DS&A can be discussed in purely implementation-language-independent terms, these intimately related topics have generally been brought to the programming and student masses in language-specific texts -- Pascal-flavored in the '70s, C-flavored in the '80s, and C++-flavored during most of the '90s. If there's any indicator that clearly supports the rumors of Java replacing C++, it's the fact that these all-important books are now Java-flavored.
Available at the time of writing were the following books that deal with algorithms, data structures and/or abstract data types (ADTs):
Here's a table to provide you with a bird's-eye view of each title's main characteristics. The table orders the books by price.
| | An Introduction to Data Structures and Algorithms with Java | Java Algorithms | Abstract Data Types in Java | Data Structures and Problem Solving Using Java | Data Structures and Algorithms in Java |
|---|---|---|---|---|---|
| Pages, Chapters (Appendixes) | 445, 15 (1) | 484, 15 (0) | 291, 12 (2) | 780, 23 (4) | 738, 16 (2) |
| Listings Density (lines/page) | 49 | 41 | 52 | 51 | 43 |
| Big-Oh Algorithm Analysis? | Poor | No | No | Thorough | Very Thorough |
| Algorithm Animation Applets? | Some | No | No | No | On Web site |
| Focus on Abstract vs. Implementation? | No | No | No (!) | Yes! | Yes |
| JGL or STL Discussed? | No | No | No | No | No |
| Math Sophistication | None | University level | None | Higher secondary level | Higher secondary level |
| Suitable for University course? | No | No | No | Yes | Yes |
| Overall Score out of 10 | 4 | 6 | 6 | 8 | 8.5 |

* Strictly speaking Yes, but the CD-ROM content is unrelated to the book's content!
In the absence of tools to calculate the true cost-per-bit equivalent of a book, the Listings Density row gives you an idea of how dense or "aerated" the source listings are. Low lines/page values usually mean wastefully padded listings and a low page-fill factor, so the higher this value, the better.
The Big-Oh Algorithm Analysis? row indicates how thorough the text is when it comes to mathematically analyzing the runtime performance of a given algorithm.
The Algorithm Animation Applets? row indicates whether the authors grabbed the staring-you-in-the-face opportunity to enhance the text with animation applets that graphically show algorithms in action. (Judging by how few did grab the opportunity, I wonder whether those authors ever saw the JDK's very own SortDemo demonstration applet.)
The Focus on Abstract vs. Implementation? row highlights whether the text draws a clear line between the definition of the abstract data type (ADT) and the near-infinite number of ways to implement that ADT.
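To make that line concrete, here's my own minimal illustration (not code from any of the reviewed books) of the ADT/implementation split: a hypothetical `Stack` interface captures the abstract data type's behavior, while `ArrayStack` and `LinkedStack` are just two of the near-infinite ways to implement it.

```java
// Illustrative sketch: the ADT is the contract, not any one implementation.
// (Pre-generics, Object-based style, as in the Java of the reviewed books.)
interface Stack {
    void push(Object item);
    Object pop();          // callers must not pop an empty stack
    boolean isEmpty();
}

// Implementation 1: a growable array.
class ArrayStack implements Stack {
    private Object[] items = new Object[10];
    private int top = 0;

    public void push(Object item) {
        if (top == items.length) {                    // grow on demand
            Object[] bigger = new Object[items.length * 2];
            System.arraycopy(items, 0, bigger, 0, top);
            items = bigger;
        }
        items[top++] = item;
    }
    public Object pop() { return items[--top]; }
    public boolean isEmpty() { return top == 0; }
}

// Implementation 2: a singly linked list.
class LinkedStack implements Stack {
    private static class Node {
        Object item; Node next;
        Node(Object item, Node next) { this.item = item; this.next = next; }
    }
    private Node top = null;

    public void push(Object item) { top = new Node(item, top); }
    public Object pop() {
        Object item = top.item;
        top = top.next;
        return item;
    }
    public boolean isEmpty() { return top == null; }
}
```

Client code written against `Stack` works unchanged with either class -- which is precisely the separation the better books in the table insist on.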
The JGL or STL Discussed? row highlights whether another very obvious opportunity was grabbed: the opportunity to take an in-depth look at ObjectSpace's Java Generic Library (JGL), or, at least, to give an overview of the philosophy and design behind JGL or JGL's C++ ancestor -- the Standard Template Library. (I didn't include a row covering the new JDK 1.2 Collections API because this API hasn't been finalized yet, and in any case, none of the books even mention it.)
The Math Sophistication row should give you an idea of how strong your stomach will need to be when it comes time to wade through the respective authors' use of mathematics to analyze algorithms and/or prove theorems relating to a given algorithm.
For a book that's squarely aimed at university courses, I found Rowe's An Introduction to Data Structures and Algorithms with Java unacceptably weak. Because educational establishments should teach students sound practices based on conventionally accepted norms, it's disheartening to see a book that implants in students "knowledge" and "techniques" that range from questionable to downright counter-productive. My main criticism is that this book has no intellectual muscle to it: The author seems to think that computer science students cannot handle problems beyond a certain, unrealistically low, level of complexity. This is a fundamentally flawed assumption, and no way to groom the software engineers of tomorrow, who will have to tackle application complexity that will dwarf anything from the present and past.
One of the perceived mental obstacles that the author avoids like the plague is the (simple) math necessary to formally analyze an algorithm's running-time characteristics. So classic Big-Oh reasoning goes out the window in favor of an experimental, quantitative approach that seeks to determine an algorithm's Big-Oh running-time signature by interpreting plots that map input size to runtime. The author does this very consciously, as he explains in his Preface:
The efficiency of the algorithm is then deduced from graphs of the results of these experiments. Although such an approach is certainly not rigorous and will probably horrify mathematicians, the author believes that it is a much clearer way of presenting the topic of efficiency to a non-mathematical audience. What is lost in rigour is recouped in understanding.
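For readers unfamiliar with the approach, the kind of experiment Rowe describes can be sketched like this (my own illustration, not a listing from the book): time a known quadratic algorithm at doubling input sizes, and the measurements should roughly quadruple at each step -- the empirical fingerprint of O(n^2).

```java
import java.util.Random;

// Sketch of the "plot and eyeball" approach to efficiency analysis:
// measure selection sort (a classic O(n^2) algorithm) at doubling
// input sizes and print the elapsed times for plotting.
public class TimingSketch {

    static void selectionSort(int[] a) {
        for (int i = 0; i < a.length - 1; i++) {
            int min = i;
            for (int j = i + 1; j < a.length; j++)
                if (a[j] < a[min]) min = j;
            int t = a[i]; a[i] = a[min]; a[min] = t;
        }
    }

    public static void main(String[] args) {
        Random rnd = new Random(42);              // fixed seed: repeatable runs
        for (int n = 2000; n <= 16000; n *= 2) {
            int[] data = new int[n];
            for (int i = 0; i < n; i++) data[i] = rnd.nextInt();

            long start = System.nanoTime();
            selectionSort(data);
            long elapsed = System.nanoTime() - start;

            // Plot n against time; for O(n^2), doubling n roughly
            // quadruples the elapsed time.
            System.out.println("n = " + n + "  time = "
                               + (elapsed / 1000000) + " ms");
        }
    }
}
```

The approach does convey the shape of the curve, but as the quote concedes, it proves nothing: JIT warm-up, garbage collection, and cache effects can all distort a single run, which is exactly why the rigorous books pair measurement with Big-Oh reasoning.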
I wonder, then, what Rowe's justification is for the total lack of separation between an ADT and its possible implementations in his book. In fact, "ADT" isn't mentioned in the index, nor, as far as I could detect, is this pivotal concept discussed at any length in the text itself. Either formal algorithm analysis or a strong, consistent focus on the ADT concept forms the legitimate backbone of any DS&A book. With neither, does this book at least pay attention to the need for accurate, readable, and squeaky-clean code? Not as far as I'm concerned, I'm afraid.