In most of the places I've worked, the sales team was never seen in a positive light, as if working in sales somehow made you a worse person. Sales gets treated as a manipulation effort to extract money from people, barely a step above theft.
It's time to adjust our perspectives, no?
I was once travelling with a sales rep who told me, "You know, sales is the team that goes and gets the money for your paycheck. It's great that I get to sell a product that works so well." That's the symbiosis: we need sales to build relationships with our clients, and we need a great product and great service to keep those clients coming back.
Sales, really, is just pre-sales customer service.