I've practiced nursing for over 15 years and have worked in many states on the East Coast. For the most part, nurses are trusted by the general public and, according to recent polls, are seen as honest professionals. The American public gives more respect to doctors, but tends to trust nurses more to save their lives and show genuine compassion. Even so, nurses still get blamed for almost everything that goes wrong; in that sense, nurses are often used as scapegoats.
Nursing has evolved from the sharp white nursing caps, starched white dresses, neat white hose, and neat white shoes to more casual scrubs and sneakers, minus the cap. We're no longer expected to be generalists, rotating through various departments or areas of nursing, or to work rotating shifts. We can choose our specialty and our shift, whether that's pediatrics on the day shift or something else entirely.
Nurses also no longer have to clear the nursing station for doctors or give up our seats (although in the Southern states, some older doctors still expect it). Doctors still yell at us, but I think that happens everywhere in the world. In closing, many nursing organizations provide unions that protect nursing jobs. That's a plus.