

Taming Tiger, Part 2

Understanding generics



   Vector<A> v = new Vector<A>();


The type is optional, to maintain backwards compatibility with legacy code. If the type were mandatory, then any code with a declaration such as Vector v = new Vector(); would no longer compile. Once a parameterized type is declared, using it is no different from using a regular (pre-1.5) nonparameterized type.
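For instance, a parameterized Vector is declared and used just like a raw one; the only practical differences are that the compiler checks element types on insertion and no cast is needed on retrieval. The following sketch (variable names are illustrative) puts the two styles side by side:

```java
import java.util.Vector;

public class ParameterizedVsRaw {
    public static void main(String[] args) {
        // Pre-1.5 style: a raw Vector holds Objects, so a cast is
        // required when retrieving an element
        Vector raw = new Vector();
        raw.add("hello");
        String s1 = (String) raw.get(0);

        // 1.5 style: same usage, but the compiler enforces the element
        // type and get() returns String directly -- no cast needed
        Vector<String> typed = new Vector<String>();
        typed.add("hello");
        String s2 = typed.get(0);

        System.out.println(s1.equals(s2));
    }
}
```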

There are subtleties to keep in mind while using generics. Consider the following seemingly harmless code fragment:

// Somewhere in the program create a Vector 
Vector<String> v2 = new Vector<String>();
Vector<Object> v = v2;


Compile the code and run it. What? It won't compile. Even though Object is the base class of all classes, including String, you cannot assign a vector of type String to a vector of type Object. The converse is also true, and is the more intuitive of the two. Such rules exist to ensure compile-time type safety and to prevent errors such as the following:

// Somewhere in the program create a Vector 
Vector<String> v2 = new Vector<String>();
Vector<Object> v = v2;
v.add(new A());
String s = v2.get(0);


Had J2SE 1.5 relaxed its assignment rules, the above code would compile, but it would throw a ClassCastException at runtime when the line String s = v2.get(0); executes.
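You can still reproduce this failure today by routing the assignment through a raw type, which the compiler permits (with an "unchecked" warning) for backwards compatibility. The sketch below assumes nothing beyond java.util.Vector and demonstrates that the exception surfaces at retrieval, not at insertion:

```java
import java.util.Vector;

public class RawTypeLeak {
    public static void main(String[] args) {
        Vector<String> v2 = new Vector<String>();
        // A raw reference defeats the compile-time check
        Vector v = v2;               // compiles with only a warning
        v.add(new Object());         // a non-String sneaks into v2

        boolean caught = false;
        try {
            // The compiler inserts a hidden cast to String here, and
            // that cast fails because element 0 is a plain Object
            String s = v2.get(0);
        } catch (ClassCastException e) {
            caught = true;
        }
        System.out.println("Caught ClassCastException: " + caught);
    }
}
```

This is exactly the class of bug the assignment rules are designed to catch at compile time instead of at runtime.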

Using wildcards

To understand what wildcards are and why we need them, consider the following code fragment:

// Declare a Mammal class
abstract class Mammal
{
   abstract public void eat();
}
// Declare a Dog class
class Dog extends Mammal
{
   public void eat()
   {
      System.out.println("Dog is eating.");
   }
}
// Declare a Cat class
class Cat extends Mammal
{
   public void eat()
   {
      System.out.println("Cat is eating.");
   }
}
// Somewhere in the code create a vector of dogs and cats
Vector<Dog> dogs = new Vector<Dog>();
dogs.add(new Dog());
dogs.add(new Dog());
Vector<Cat> cats = new Vector<Cat>();
cats.add(new Cat());
cats.add(new Cat());
cats.add(new Cat());
// Now make all the dogs and cats eat
Vector<Mammal> pets = dogs;
for(Mammal pet: pets)
{
   pet.eat();
}
pets = cats;
for(Mammal pet: pets)
{
   pet.eat();
}


By now we already know that this code, when compiled, will produce the following errors:

incompatible types
found   : java.util.Vector<Dog>
required: java.util.Vector<Mammal>
                Vector<Mammal> pets = dogs;
                                      ^
incompatible types
found   : java.util.Vector<Cat>
required: java.util.Vector<Mammal>
                pets = cats; 


But what if I wanted to make this code work? After all, it is legitimate, albeit contrived, code. The answer is simple and involves using wildcards. Here's the one-line change that will appease the Java compiler:

   Vector<? extends Mammal> pets = dogs;


Now, after you successfully compile and run the code, you will see the following output:

Dog is eating.
Dog is eating.
Cat is eating.
Cat is eating.
Cat is eating.


So how did that work? The ? is a wildcard character that, along with the extends keyword, declares pets as a Vector of any type that derives from Mammal. That differs from our earlier line, which declared pets as a Vector that could hold only objects of the exact type Mammal. To accept a Vector of any kind of object, you could use the unbounded wildcard in the declaration:

   Vector<?> v;
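One subtlety is worth noting here: through a Vector<? extends Mammal> reference you can safely read elements as Mammal, but the compiler rejects any attempt to add an element, because it cannot know which particular subtype the Vector actually holds. The following sketch (reusing the Mammal and Dog classes from above, nested here to keep the example self-contained) illustrates both sides:

```java
import java.util.Vector;

public class WildcardReadOnly {
    abstract static class Mammal { abstract public void eat(); }
    static class Dog extends Mammal {
        public void eat() { System.out.println("Dog is eating."); }
    }
    static class Cat extends Mammal {
        public void eat() { System.out.println("Cat is eating."); }
    }

    public static void main(String[] args) {
        Vector<Dog> dogs = new Vector<Dog>();
        dogs.add(new Dog());

        // Reading through the wildcard is safe: whatever the element
        // type is, it is guaranteed to be at least a Mammal
        Vector<? extends Mammal> pets = dogs;
        Mammal first = pets.get(0);
        first.eat();

        // Adding is not safe: pets might really be a Vector<Dog> or a
        // Vector<Cat>, so the compiler rejects the following line:
        // pets.add(new Cat());   // compile-time error
    }
}
```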


Resources