I think the answer changed about the beginning of this century, because before then there simply was no such thing as guided, systematic R&D. Previous epochs didn't have their technology under even the tenuous control that we do, so culture (in the more restricted sense) could not determine technology, which grew catch-as-catch-can. What culture could do was pick and choose from the existing technologies, selecting some for rhetorical, ornamental, figurative, emblematic uses. Thus clocks became important figures, but not windmills or other self-regulating machines (see Otto Mayr, Authority, Liberty, and Automatic Machinery in Early Modern Europe), and automobiles, airplanes and dynamos were all emblematic in the early decades of this century, but not punch-card tabulators or automatic switching circuits.
Today, of course, things are different. If we (collectively; in practice, the comparatively small number of people who disburse research funds) want to push technology in a certain direction, we can, provided that direction isn't altogether absurd. Naturally, there is a strong self-fulfilling element to this: first, someone at ARPA decides that (to make up some technobabble) cognitive triage is the Way; then some smart people with strong ARPA backing start working on cognitive triage, and, being clever, well-funded people with a workable scheme, get some nifty results; then NSF starts pushing cognitive triage; and you can take it from there. There are limits to this, of course, and the diversity of biases among funding sources provides for some competition and selection, but whether that competition is adequate may be doubted without slipping into social-constructionist heresy.
This doesn't mean that technical emphases arrived at in this manner are bad, or even that better ones are available, but it does undercut arguments of the form ``X is embodied in all these nifty new gadgets and trends, so X is the Way.''