Increasingly, businesses talk about the importance of practicing “social responsibility,” but what does that mean? Socially responsible companies adopt policies that promote the well-being of society while working to lessen their negative impacts on it.