Woke is a relative term. For example, woke in Afghanistan might mean believing women should be allowed to attend college and not be forced to wear a burqa. Americans didn't invent wokeism; we evolved it in a crucible of democracy and a general freedom to express ourselves. It's better to think of the United States as a microcosm of inter-cultural relations, a scratchpad for trying new ideas when the old ones aren't working.
Even what is considered "woke" is evolving and changing all the time. At the heart of it is a desire to be AWARE, awakened to our bad habits and our biases. Just as California is a sort of scratchpad within America for where the culture is moving, America influences the world. And even though we have lots of dummies, you can see we have plenty of geniuses too, such as the inventors of Medium, or iPhones, or the Webb Telescope, or mRNA vaccines. Thus do we have influence, for better or worse.
As for patriarchy, I've seen firsthand sexism at every level of society, and I've seen more competent women passed over for men, or paid less for doing the same job better. Look at the world's biggest messes and you'll see men running the show. Maybe there's a more just world awaiting that is neither Patriarchal nor Matriarchal, but Equiarchal. It's never been achieved, but it may very well be that Americans are the first to get there, led by overzealous woke activists who keep fighting the fight. If not them, then who? The status quo sucks all around the world!