Society distrusts science to some degree. For example, many people believe scientists are in cahoots with the government and will never find cures for diseases because treating a disease is more profitable than curing it. I don't personally believe this, in part because scientists and doctors are not even in the same profession. Still, people do seem to blame scientists for a lot of things. Some healthy skepticism is certainly reasonable: the news often reports a single scientific study as fact, when a study is really just evidence in support of a theory. We should remain skeptical of a claim until it is supported by multiple peer-reviewed studies.
There is definitely a segment of society that does not believe in scientific research or the facts it presents. Sometimes this is because they do not trust that the data is correct; other times they accept the data but do not understand the reasoning behind it. Society needs to trust science more.
I think people look to scientists as the ones, perhaps the only ones, who will tell them the truth. However, that truth changes as new evidence is found, and it rests on hypotheses that are not necessarily valid. So we may actually be too trusting of what current scientists tell us.
As a whole, society places great trust in science. We rely on scientists and their breakthroughs to solve common problems in civilized society. Science is essential for most people, and most Americans trust it. There are certain aspects of science that Americans might not trust, though, since some people abuse those discoveries.
Society relies on science, so it cannot completely distrust it. When scientists are developing something new, society sometimes has to wait for them to prove that it works, but that is caution, not distrust. Once science proves that something works, society becomes appreciative of it.