  1. caeab08 slub page alloc fallback: Enable interrupts for GFP_WAIT. by Christoph Lameter · 16 years ago
  2. b621038 slub: Do not cross cacheline boundaries for very small objects by Nick Piggin · 16 years ago
  3. b773ad7 slub statistics: Fix check for DEACTIVATE_REMOTE_FREES by Christoph Lameter · 16 years ago
  4. 62e5c4b slub: fix possible NULL pointer dereference by Cyrill Gorcunov · 16 years ago
  5. f619cfe slub: Add kmalloc_large_node() to support kmalloc_node fallback by Christoph Lameter · 16 years ago
  6. 7693143 slub: look up object from the freelist once by Pekka J Enberg · 16 years ago
  7. 6446faa slub: Fix up comments by Christoph Lameter · 16 years ago
  8. d8b42bf slub: Rearrange #ifdef CONFIG_SLUB_DEBUG in calculate_sizes() by Christoph Lameter · 16 years ago
  9. ae20bfd slub: Remove BUG_ON() from ksize and omit checks for !SLUB_DEBUG by Christoph Lameter · 16 years ago
  10. 27d9e4e slub: Use the objsize from the kmem_cache_cpu structure by Christoph Lameter · 16 years ago
  11. d692ef6 slub: Remove useless checks in alloc_debug_processing by Christoph Lameter · 16 years ago
  12. e153362 slub: Remove objsize check in kmem_cache_flags() by Christoph Lameter · 16 years ago
  13. d9acf4b slub: rename slab_objects to show_slab_objects by Christoph Lameter · 16 years ago
  14. a973e9d Revert "unique end pointer" patch by Christoph Lameter · 16 years ago
  15. 00e962c Revert "SLUB: Alternate fast paths using cmpxchg_local" by Linus Torvalds · 16 years ago
  16. 331dc55 slub: Support 4k kmallocs again to compensate for page allocator slowness by Christoph Lameter · 16 years ago
  17. 71c7a06 slub: Fallback to kmalloc_large for failing higher order allocs by Christoph Lameter · 16 years ago
  18. b7a49f0 slub: Determine gfpflags once and not every time a slab is allocated by Christoph Lameter · 16 years ago
  19. dada123 make slub.c:slab_address() static by Adrian Bunk · 16 years ago
  20. eada35e slub: kmalloc page allocator pass-through cleanup by Pekka Enberg · 16 years ago
  21. 3adbefe SLUB: fix checkpatch warnings by Ingo Molnar · 17 years ago
  22. a76d354 Use non atomic unlock by Nick Piggin · 17 years ago
  23. 8ff12cf SLUB: Support for performance statistics by Christoph Lameter · 16 years ago
  24. 1f84260 SLUB: Alternate fast paths using cmpxchg_local by Christoph Lameter · 17 years ago
  25. 683d0ba SLUB: Use unique end pointer for each slab page. by Christoph Lameter · 17 years ago
  26. 5bb983b SLUB: Deal with annoying gcc warning on kfree() by Christoph Lameter · 16 years ago
  27. ba84c73 SLUB: Do not upset lockdep by root · 17 years ago
  28. 0642878 SLUB: Fix coding style violations by Pekka Enberg · 17 years ago
  29. 7c2e132 Add parameter to add_partial to avoid having two functions by Christoph Lameter · 17 years ago
  30. 9824601 SLUB: rename defrag to remote_node_defrag_ratio by Christoph Lameter · 17 years ago
  31. f61396a Move count_partial before kmem_cache_shrink by Christoph Lameter · 17 years ago
  32. 151c602 SLUB: Fix sysfs refcounting by Christoph Lameter · 17 years ago
  33. e374d48 slub: fix shadowed variable sparse warnings by Harvey Harrison · 17 years ago
  34. 1eada11 Kobject: convert mm/slub.c to use kobject_init/add_ng() by Greg Kroah-Hartman · 17 years ago
  35. 0ff21e4 kobject: convert kernel_kset to be a kobject by Greg Kroah-Hartman · 17 years ago
  36. 081248d kset: move /sys/slab to /sys/kernel/slab by Greg Kroah-Hartman · 17 years ago
  37. 27c3a31 kset: convert slub to use kset_create by Greg Kroah-Hartman · 17 years ago
  38. 3514fac kobject: remove struct kobj_type from struct kset by Greg Kroah-Hartman · 17 years ago
  39. 158a962 Unify /proc/slabinfo configuration by Linus Torvalds · 17 years ago
  40. 57ed3ed slub: provide /proc/slabinfo by Pekka J Enberg · 17 years ago
  41. 76be895 SLUB: Improve hackbench speed by Christoph Lameter · 17 years ago
  42. 3811dbf SLUB: remove useless masking of GFP_ZERO by Christoph Lameter · 17 years ago
  43. 7fd2725 Avoid double memclear() in SLOB/SLUB by Linus Torvalds · 17 years ago
  44. 294a80a SLUB's ksize() fails for size > 2048 by Vegard Nossum · 17 years ago
  45. efe4418 SLUB: killed the unused "end" variable by Denis Cheng · 17 years ago
  46. 05aa345 SLUB: Fix memory leak by not reusing cpu_slab by Christoph Lameter · 17 years ago
  47. 27bb628 missing atomic_read_long() in slub.c by Al Viro · 17 years ago
  48. b9049e2 memory hotplug: make kmem_cache_node for SLUB on memory online avoid panic by Yasunori Goto · 17 years ago
  49. 4ba9b9d Slab API: remove useless ctor parameter and reorder parameters by Christoph Lameter · 17 years ago
  50. b811c20 SLUB: simplify IRQ off handling by Christoph Lameter · 17 years ago
  51. ea3061d slub: list_locations() can use GFP_TEMPORARY by Andrew Morton · 17 years ago
  52. 42a9fdb SLUB: Optimize cacheline use for zeroing by Christoph Lameter · 17 years ago
  53. 4c93c355 SLUB: Place kmem_cache_cpu structures in a NUMA aware way by Christoph Lameter · 17 years ago
  54. ee3c72a SLUB: Avoid touching page struct when freeing to per cpu slab by Christoph Lameter · 17 years ago
  55. b3fba8d SLUB: Move page->offset to kmem_cache_cpu->offset by Christoph Lameter · 17 years ago
  56. 8e65d24 SLUB: Do not use page->mapping by Christoph Lameter · 17 years ago
  57. dfb4f09 SLUB: Avoid page struct cacheline bouncing due to remote frees to cpu slab by Christoph Lameter · 17 years ago
  58. e12ba74 Group short-lived and reclaimable kernel allocations by Mel Gorman · 17 years ago
  59. 6cb0622 Categorize GFP flags by Christoph Lameter · 17 years ago
  60. f64dc58 Memoryless nodes: SLUB support by Christoph Lameter · 17 years ago
  61. ef8b452 Slab allocators: fail if ksize is called with a NULL parameter by Christoph Lameter · 17 years ago
  62. 2408c55 {slub, slob}: use unlikely() for kfree(ZERO_OR_NULL_PTR) check by Satyam Sharma · 17 years ago
  63. aadb4bc SLUB: direct pass through of page size or higher kmalloc requests by Christoph Lameter · 17 years ago
  64. 1cd7daa slub.c:early_kmem_cache_node_alloc() shouldn't be __init by Adrian Bunk · 17 years ago
  65. ba0268a SLUB: accurately compare debug flags during slab cache merge by Christoph Lameter · 17 years ago
  66. 5d540fb slub: do not fail if we cannot register a slab with sysfs by Christoph Lameter · 17 years ago
  67. a2f92ee SLUB: do not fail on broken memory configurations by Christoph Lameter · 17 years ago
  68. 9e86943 SLUB: use atomic_long_read for atomic_long variables by Christoph Lameter · 17 years ago
  69. 1ceef40 SLUB: Fix dynamic dma kmalloc cache creation by Christoph Lameter · 17 years ago
  70. fcda3d8 SLUB: Remove checks for MAX_PARTIAL from kmem_cache_shrink by Christoph Lameter · 17 years ago
  71. 2208b76 slub: fix bug in slub debug support by Peter Zijlstra · 17 years ago
  72. 02febdf slub: add lock debugging check by Peter Zijlstra · 17 years ago
  73. 20c2df8 mm: Remove slab destructors from kmem_cache_create(). by Paul Mundt · 17 years ago
  74. 9550b10 slub: fix ksize() for zero-sized pointers by Linus Torvalds · 17 years ago
  75. 8ab1372 SLUB: Fix CONFIG_SLUB_DEBUG use for CONFIG_NUMA by Christoph Lameter · 17 years ago
  76. a0e1d1b SLUB: Move sysfs operations outside of slub_lock by Christoph Lameter · 17 years ago
  77. 434e245 SLUB: Do not allocate object bit array on stack by Christoph Lameter · 17 years ago
  78. 81cda66 Slab allocators: Cleanup zeroing allocations by Christoph Lameter · 17 years ago
  79. ce15fea SLUB: Do not use length parameter in slab_alloc() by Christoph Lameter · 17 years ago
  80. 12ad684 SLUB: Style fix up the loop to disable small slabs by Christoph Lameter · 17 years ago
  81. 5af328a mm/slub.c: make code static by Adrian Bunk · 17 years ago
  82. 7b55f62 SLUB: Simplify dma index -> size calculation by Christoph Lameter · 17 years ago
  83. f1b2633 SLUB: faster more efficient slab determination for __kmalloc by Christoph Lameter · 17 years ago
  84. dfce864 SLUB: do proper locking during dma slab creation by Christoph Lameter · 17 years ago
  85. 2e443fd SLUB: extract dma_kmalloc_cache from get_cache. by Christoph Lameter · 17 years ago
  86. 0c71001 SLUB: add some more inlines and #ifdef CONFIG_SLUB_DEBUG by Christoph Lameter · 17 years ago
  87. d07dbea Slab allocators: support __GFP_ZERO in all allocators by Christoph Lameter · 17 years ago
  88. 6cb8f91 Slab allocators: consistent ZERO_SIZE_PTR support and NULL result semantics by Christoph Lameter · 17 years ago
  89. ef2ad80 Slab allocators: consolidate code for krealloc in mm/util.c by Christoph Lameter · 17 years ago
  90. d45f39c SLUB Debug: fix initial object debug state of NUMA bootstrap objects by Christoph Lameter · 17 years ago
  91. 6300ea7 SLUB: ensure that the number of objects per slab stays low for high orders by Christoph Lameter · 17 years ago
  92. 68dff6a SLUB slab validation: Move tracking information alloc outside of lock by Christoph Lameter · 17 years ago
  93. 5b95a4a SLUB: use list_for_each_entry for loops over all slabs by Christoph Lameter · 17 years ago
  94. 2492268 SLUB: change error reporting format to follow lockdep loosely by Christoph Lameter · 17 years ago
  95. f0630ff SLUB: support slub_debug on by default by Christoph Lameter · 17 years ago
  96. d23cf67 slub: remove useless EXPORT_SYMBOL by Christoph Lameter · 17 years ago
  97. dbc55fa SLUB: Make lockdep happy by not calling add_partial with interrupts enabled during bootstrap by Christoph Lameter · 17 years ago
  98. 8496634 SLUB: fix behavior if the text output of list_locations overflows PAGE_SIZE by Christoph Lameter · 17 years ago
  99. 4b356be SLUB: minimum alignment fixes by Christoph Lameter · 17 years ago
  100. dd08c40 SLUB slab validation: Alloc while interrupts are disabled must use GFP_ATOMIC by Christoph Lameter · 17 years ago