MIPS: New macro smp_mb__before_llsc.

Replace some instances of smp_llsc_mb() with a new macro,
smp_mb__before_llsc().  The new macro is used before ll/sc sequences
that are documented as needing write barrier semantics.
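
For illustration, the pattern the macro targets looks roughly like the
following sketch (hypothetical function and variable names; the MIPS
inline asm is simplified and not taken from this patch):

	static inline void hypothetical_lock(int *l)
	{
		int tmp;

		smp_mb__before_llsc();	/* order earlier writes before ll/sc */

		__asm__ __volatile__(
		"1:	ll	%0, %1		\n"	/* load-linked lock word */
		"	bnez	%0, 1b		\n"	/* spin while lock is held */
		"	li	%0, 1		\n"
		"	sc	%0, %1		\n"	/* try store; %0 = success */
		"	beqz	%0, 1b		\n"	/* retry if sc failed */
		: "=&r" (tmp), "+m" (*l)
		: : "memory");
	}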

The default implementation of smp_mb__before_llsc() is just smp_llsc_mb(),
so there are no changes in semantics.
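
Concretely, the default is the one-liner below; a sketch, assuming it
sits next to smp_llsc_mb() in arch/mips/include/asm/barrier.h (the
placement is an assumption, as that hunk is not shown here):

	/* Default: fall back to smp_llsc_mb(), so behavior is unchanged. */
	#define smp_mb__before_llsc() smp_llsc_mb()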

Also simplify the definitions of smp_mb(), smp_rmb(), and smp_wmb() to be
just barrier() in the non-SMP case.
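
That simplification could take the following shape (a sketch; the
"sync"-based SMP definitions are illustrative assumptions, since the
real ones depend on the CPU configuration):

	#ifdef CONFIG_SMP
	/* Illustrative only: the real barrier is CPU-dependent. */
	# define smp_mb()	__asm__ __volatile__("sync" : : : "memory")
	# define smp_rmb()	__asm__ __volatile__("sync" : : : "memory")
	# define smp_wmb()	__asm__ __volatile__("sync" : : : "memory")
	#else
	/* UP: a compiler barrier is sufficient. */
	# define smp_mb()	barrier()
	# define smp_rmb()	barrier()
	# define smp_wmb()	barrier()
	#endif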

Signed-off-by: David Daney <ddaney@caviumnetworks.com>
To: linux-mips@linux-mips.org
Patchwork: http://patchwork.linux-mips.org/patch/851/
Signed-off-by: Ralf Baechle <ralf@linux-mips.org>
diff --git a/arch/mips/include/asm/spinlock.h b/arch/mips/include/asm/spinlock.h
index 21ef9ef..5f16696 100644
--- a/arch/mips/include/asm/spinlock.h
+++ b/arch/mips/include/asm/spinlock.h
@@ -138,7 +138,7 @@
 {
 	int tmp;
 
-	smp_llsc_mb();
+	smp_mb__before_llsc();
 
 	if (R10000_LLSC_WAR) {
 		__asm__ __volatile__ (
@@ -305,7 +305,7 @@
 {
 	unsigned int tmp;
 
-	smp_llsc_mb();
+	smp_mb__before_llsc();
 
 	if (R10000_LLSC_WAR) {
 		__asm__ __volatile__(