in reply to
Data structures in Perl. A C programmer's perspective.
Be warned that with modern CPUs, the conventional wisdom that linked lists
should be preferred to vectors when performing many insertions and deletions
in the middle of the list is almost always wrong.
From the bulldozer blog:
The vector still kills the linked list in this test. And that's all because vector stores its data in a contiguous block of memory. The vector has so much better performance (and as you increase the size of your data structure, the difference in performance becomes much more pronounced) because it accesses memory linearly.
The memory is so slow when you access it randomly, that the penalty for cache misses almost always dominates your CPU usage time. If you care at all about performance, the main memory should be considered a sequential-access device. You have been lied to all your life; RAM is not a random-access memory device!
This has been true for the last few years, and in the future it will only become more so, barring a true revolution in hardware technology (think Ansible!) or a fundamental change in computing model (think abandoning the von Neumann model), of which there is no sign yet (and even if we had an alternative model, the transition would take decades). This particular performance killer will only get worse with time.