Missing audiences; working on words

A couple of unrelated observations from the weekend, though perhaps not so far apart, as is the way of these things:

First: had a good conversation with a colleague from University about the importance of working through words. It’s a language game, and sometimes you need to sit down, do the work, and get your reps in, as mentioned a few posts ago. And it works the same in comedy as it does in academia or in writing. This conversation was with respect to the Echanger episode, so… more to come there.

And speaking of comedy, that brings us to the second point.

Second: that part about honing the jokes through touring was also mentioned by Katt Williams in his interview on Club Shay Shay (at about the 48:30 mark):

and the timing is impeccable. (The above episode came out a day or so before I made the last post. I hadn’t seen it yet, but it aligns perfectly with the Jeselnik comment too.)

Gotta get the reps in.

Which leads us to our third point, about missing audiences.

Because, while I’ve been seeing bits from the interview all over social media (well, YouTube and TikTok), it’s been completely absent from Mastodon and the Fediverse.

It speaks to a massive hole in Mastodon and the Fediverse more generally.

And the clip’s absence here is very telling.

Now, the most charitable argument one could make – perhaps – is that the Fediverse isn’t obsessed with celebrity culture, and isn’t interested in the beefs that actors and comedians may be having with one another.


But we know that some celebrity and/or Hollywood discussion does exist there, if not a ton.

The audiences that are talking about the clip – Gen Z and Millennials, young people and people of colour – aren’t there having that discussion. They’re in other places.

Why are they staying away from Mastodon?

I’m curious to find out…

Swift Studies

A recent article on an academic conference devoted to Taylor Swift prompted some discussion online. The article by Emily Yahr was posted on Dec 26, 2023, here:


My response was as follows:

I felt similarly perhaps 15–20 years ago when a college offered a semester-length course on the topic of Lady Gaga, and I was aghast, but then I kinda got over it.
Taylor Swift is no different in this regard, though I think the Swifties are more of a force than the Monsters ever were.
But that’s the thing: the fact that both of them have a fanbase large enough to be a) identified by name and b) make an impact beyond the pop music sphere warrants the study.

And it’s not like pop-culture focused conferences are a new thing. From the article:
“One academic told her that, after speaking at events focused on Bob Dylan, Nirvana and the Beatles, they were thrilled to discuss a prominent female artist.”
… so there is this, at least, with expanding the scope of artists that can be discussed.

I think that’s pretty swell.

(And in the interest of full disclosure, I’ve presented on pop-culture-related topics academically at the PCA before, as well as at several Games Studies and Film Studies conferences.)

I think there’s value in the conference, though. The budget is usually pretty minimal, relatively speaking, from my experience with a couple of organizing committees. It lets the researchers get some reps in too, which honestly can be invaluable.

And obviously there is *something* going on with Swift and her fanbase, so a bit of scrutiny isn’t a bad thing, even if I’m not on board with Lacanian interpretations of Swift’s Folklore.

(Or anything Lacanian, tbh.)

When it comes to the utility of examining pop culture, I’ll grab a quote by Bruce Sterling from a couple of decades past:

“The most fertile ground for analyzing motives is pop culture – not because pop culture is deep, but because it’s so shallow. It’s where those wishes and longings are most nakedly evident” (Sterling, 2002, pp. xii–xiii).

It was informative when I was looking at the role of #ScienceFiction back in the early Double-Ohs. It’s still solid now.

This whole subject was on my to-do list for the podcast a couple episodes from now. Look for an episode titled “The Old Man and the River” in the new year. I’ll link back to this when it gets posted.

After PoMo

Wrote something down earlier today, and I wanted to capture it here:

Author John Shirley mentioned that a chess program he was playing had a rather chatty Deadpool bot making commentary on the game, and came up with this:

My reply:

Not ‘never again’ though. If #Deadpool is an exarch of #postmodern referentiality and Rortyian #ironism , then the way to recovery is through the #romance of just liking things for their own sake again.

(Or to quote Abed from #Community : “I guess I just like liking things.”)

(It’s how I broke bread with the #furries and #bronies. They were ahead of the curve on this. We’re good now.)

And I think this is worth expanding on, but as I said last post, it’s crunch time.

Films as Art…

On Mastodon on the evening of November 17th, 2023, @Ricki.Tarr asked the following question:

And the responses have been pretty amazing. But my own feeling is this:

@RickiTarr Most of them do, tbh. I think the underlying message behind “Every Frame a Painting” captured it well. (Miss that channel.)

But seeing as I used to use 80s and 90s action movies for examples in my film class, I might be an outlier.

Anyhoo, a few:

#madmaxfuryroad The desert storm.
The fight on the cliff
#serenity The Serenity dolly shot to open
#GrandBudapestHotel Just the color design, in all of it.
#johnwick Fights as ballet
#matrix2 The Chateau foyer fight

And more…

/2 Been thinking about this all night, how most of my examples would come from some basic and banal films.

And I think that’s the beauty of it: they’re all #art

The underlying assumption in the responses to the thread is that “art” must be something transcendent or sublime – that a singular work standing above the others is somehow “art” in ways that the rest are not.

But I still find art in #billandtedsexcellentadventure or #videodrome or #returnofthelivingdead

3/ Or #conanthebarbarian or #jurassicpark or #thefifthelement or any of hundreds of other films I find engaging.

Now some films might push this a little bit. If I’m Dan Harmon, I might have issues with #NowYouSeeMe f’rex, and struggle to find the art there, but this is where the subjective alights within #cinema

(Waitasecond, can I still use #frex , or did some tool decide to use it as the label for an AI-generated thesaurus for your smartphone in the last week?)

4/ Anyhoo, where was I? (phone call disrupted the tenuously connected synapses there…)

Oh yeah, there’s beauty in the basics, which is really what I was getting at.

The most recent flick I saw was Aronofsky’s #PostcardFromEarth, the pseudo-doc made to showcase the capabilities of #TheSphere in #LasVegas

It’s a brief sliver of a movie with a #scifi wrapper and an eco-friendly message.

It’s a singular experience, worth checking out. Most definitely #art

but so was #furyroad

Art can make you feel the little things too, is what I’m getting at.

Which is where I’m at. So I’m thinking I’ll list out the films, and the frames, that strike me as “art”, however it gets (subjectively) defined, and I’ll run through the list here in the next few days.

Also, more on that Aronofsky film coming soon…

Implausipod E015 – EEE, Embrace Extend, Extinguish

EEE, or Embrace, Extend, Extinguish, has been making the rounds again in 2023 as a number of Silicon Valley tech companies have come under the spotlight for their business practices, showing some striking similarities to a strategy outlined by Microsoft in an internal memo back in the 1990s. Everything old in tech is new again.


In 1999, Judge Thomas Penfield Jackson of the U.S. District Court for the District of Columbia issued his findings in United States v. Microsoft Corp., the antitrust suit brought by the government against the tech giant over allegations that it was using its power to bundle the browser with the Windows operating system, and that this constituted an abuse of its monopoly position within the desktop computer market.

During the course of the trial, it was revealed that Microsoft had an internal policy of embrace, extend, and innovate. But witness Steven McGeady testified that privately, Microsoft executives referred to this as embrace, extend, extinguish, with the goal of marginalizing or eliminating direct competition. Other tech companies started taking notes for use in the 21st century. Let’s talk about Triple E in this week’s episode of the ImplausiPod.

Welcome to the ImplausiPod, a podcast about the intersection of art, technology, and popular culture. I’m your host, Dr. Implausible, and since we came back from hiatus with episode eight, we’ve mentioned EEE a few times in relation to things like the Fediverse, so I thought there was no better time than now to get caught up.

First off, the reason why a case from the 90s still matters in 2023 is that it never really went away, and here and now we’re starting to see some more signs of it with some big players, both new and old. Potential examples in 2023 include Facebook, Google, and again Microsoft, and it may affect things that you use on a daily basis.

Let’s cover off the main points. What is Embrace, Extend, Extinguish, and what does it mean for computing and the internet? EEE, or Triple E. That’s right, this episode is all about the game, and the game is follow-the-leader. Anywho, Triple E was an internal policy pursued by Microsoft in the 90s with respect to its competition in a number of key markets. It was first revealed during the antitrust case I mentioned in the open, where an internal memo brought into evidence showed that they referred to the strategy as Embrace, Extend, and Innovate. This was part of a number of texts submitted into evidence, including emails and quotes from Microsoft executives and others, like Steve McGeady of Intel, where he was a VP at the time.

In his testimony at the trial, McGeady revealed that they, Microsoft, had referred to it as Extinguish internally. Now, these documents are from the antitrust case, and are separate from another set of docs, collectively referred to as the Halloween documents, which were leaked to Eric S. Raymond and detailed Microsoft’s attitudes and plans regarding Linux and open-source software. Those show that Microsoft was still aggressive against competition but had to use a different approach due to the distributed and non-commercial nature of the FOSS community. Here, they pursued tactics like the deployment of FUD – fear, uncertainty, and doubt – or announcing vaporware products: stuff that would compete with a given product if it came to market, but that they had no intention of ever actually making.

They’d also engage in the practice of extending protocols and developing new ones, and de-commoditizing existing protocols in order to crater the market for stuff that was running on them. And from these latter documents, we can better see what their corporate strategy goals were. It was a set of social and policy actions which they used to maintain their market position against other vendors, who often had better technological solutions. It’s similar to what we talked about in the Endless September episode, where AOL had a technically inferior product but was able to compete on presence in the marketplace, with its ubiquitous floppy disks and CD coasters and a streamlined user experience. This was one of the reasons that the case was so important.

By using their market size to shut other vendors out of the market, they were stifling innovation and preventing competition. And this is something that still raised some eyebrows back in the 90s. With the original case, Microsoft ran afoul of the Sherman Antitrust Act. It was a business-to-business crime, B2B, so when the afflicted parties petitioned the U.S. government about the impacts, and concerns were raised about the lack of competing alternatives, they, the government, eventually took action.

As a reminder, this was before smartphones were a thing and the market shifted. Apple had a tiny fraction of the desktop market, around 3 percent in 1999; Linux was very niche; and other operating systems mostly found use within specific corporate use cases, with a tiny user base compared to Windows as well. All told, Microsoft was on about 95 percent of all desktops and laptops sold. And this number was actually growing through the Y2K period up to the dot-com crash.

And the reason we’re bringing it up here again in 2023 is that apparently everything old in tech is new again. There’s been the rollout of some new apps, programs, and tools, and there are a number of court cases actually taking place right now, in the fall of 2023, involving major tech players that you’re not hearing about because of other criminal enterprises currently in the news.

So I’m going to take a moment to cover each of them in turn and how they relate back to Triple E, and cover some of the theoretical background while we’re doing this as well. And the first one we want to talk about, of course, is the one that started this whole thing: Threads, the Twitter-like communication app launched by Meta, née Facebook, under their Instagram brand, was made available to users on July 5th, 2023.

Now, prior to its launch, there had been rumors of its development. In an article on TheVerge.com on June 8th, Alex Heath went into the details of the app, which at the time was called “Project 92”. The main rumor was that it would be using something called the ActivityPub protocol, which, as we’ve discussed plenty of times, is the thing that’s powering Mastodon and the rest of the Fediverse. This rumor caused a lot of consternation, especially within the Fediverse at large, mostly due to Meta’s past track record, which hasn’t been great. If you’re wondering what kind of things might be involved, just do a web search for Cambridge Analytica, or for the Rohingya in Myanmar. Don’t search for it on any Meta-owned properties, because you won’t find much. And for those reasons and more, a number of the people who were already on the Fediverse as early adopters of the protocol were engaged with it because it was explicitly not a Facebook property.

So when a post was made on June 18th by an admin from one of the larger instances on Mastodon that, yes, they’d been in discussion with Meta regarding the ActivityPub protocol and the possible integration that would take place, there was a lot of uproar and consternation, and one of the things that got mentioned a lot during the ensuing discussion was the idea of Triple E. Now, admins of some other instances, and some users, said they were going to pre-block Meta because they were concerned that any connection with them might allow leverage, or allow their information to be shared.

You know, they’d be turned into a commodity, much like we’ve discussed earlier. There are those online who don’t want any part of Facebook. And the other concern was that Facebook would go full Triple E on the ActivityPub protocol: embrace it, by letting Threads link to it directly; extend it in some Meta-friendly way, probably by allowing advertising or something similar; and then extinguish it, ultimately, at some unspecified point in the future as they roll on to a new program or platform – in much the same way we saw as standard operating procedure for Microsoft back in the 90s. In so doing, the people who had found a home away from Meta, away from Facebook, would lose their online homes. So you can understand their concerns. But there’s a related set of concerns tied directly to the Triple E phenomenon, and that is the notion of path dependency and vendor lock-in.

There’s an old story – we might call it a meme – that does the rounds on the Internet every six to nine months or so. Stop me if you’ve heard it. It goes: the size of the Space Shuttle’s boosters was determined by the width of a Roman chariot, or two donkeys, or something like that. I’ll let you look it up. There are a couple of recent examples. Also, I’m not going to stop even if you’ve heard it.

Here’s the full story. As it goes, the diameter of the Space Shuttle boosters is the size it is because they had to be shipped via rail cross-country from Utah to Florida. Standard gauge railroads in North America are 4 feet 8.5 inches. The standard rail gauge is that size because the Americans bought their early equipment from the English, who used a similar gauge for their equipment. And this was fixed because the English tram manufacturers designed their wagons to fit the roads of the English countryside. And those were set at that distance because of the Roman chariots that had driven on the roads millennia before and had worn grooves in them, which had then been used by generations of Englishmen. So the width of the train tracks was directly influenced by the width of two Roman horses, or donkeys. There are variations in the story; you may have heard it differently.

It is, of course, nonsense. 

The size of a donkey had very little to do with the size of the Space Shuttle. There were multiple different standards of rail lines in use in North America between 1831 and 1981, when the Space Shuttle first launched, though its design had begun significantly earlier. Any of these could have been the standard, though again, some gauges had significant advantages over others. More on this later. But tracing the links of contingencies, facts, and counterfactuals necessary to draw a straight line from donkey carts to rocket boosters requires levels of hand-waving once reserved for members of the royal family. It just ain’t a thing.

Especially when you consider that the diameter of the SRB is 12.17 feet. You’d need to be doing some Steiner math to get that story to work. But what it does illustrate is the idea of path dependency: the lock-in that can be caused by initial embedded choices. And I know this may seem like an odd rhetorical strategy, undermining a specific well-known example in aid of explaining what it is, but in this case the particular illuminates the general case, even if it doesn’t specifically abide by it.

Path dependence can be a real issue, especially when it comes to technology. It’s usually brought up in terms of standards. We can think of things like the QWERTY keyboard design, or the various forms of coffee pods, as shaping the direction of the market. And these can both be true, but to really get a handle on path dependency, let’s think about it in terms of something massive, like really big, like the automotive market in North America. It’s so big and entrenched that making substantive changes to it would be extremely difficult. So how would one go about changing the auto system? By using something that can overlap with the grooves that are already cut, to a greater or lesser degree. You add in electric vehicles that mirror the shape of, and conform to, the systems that are already present, and offer charging stations that resemble in some fashion the filling stations currently familiar to your audience, so that they can be more easily adopted. Moving to electric vehicles that look like cars leverages over a century of design decisions and development, and allows for an easier adoption by new customers – or at least that’s how the thinking goes. So electric cars follow the path dependency laid down by successive generations of gas-powered automobile designs and drivers.

What’s related to path dependency, though not exclusive of it, is the idea of technological lock-in. And this is where those K-Cups and keycaps come back into the picture. The keycaps in this instance are the ones that spell out Q W E R T Y on the top of your keyboard – though in this day and age, you can order a version that spells out anything you like. (At some point, we’re going to have to have a chat about innovation as a driver of change in secondary or tertiary markets, but we’ll move on for now.)

So the idea of path dependency really came about from the field of evolutionary economics. Paul David wrote about it in 1985, about the risks of technological lock-in, in his famous paper “Clio and the Economics of QWERTY”. Okay, famous among economists, but still famous. Clout’s clout, right? David was writing about the historical competition between two famous keyboard layouts: the QWERTY keyboard, the one that you’re likely familiar with, and the DSK, or Dvorak Simplified Keyboard. The DSK was patented in 1932, and it was faster, better, more efficient; the U.S. Navy even tested it out and found that it only took about 10 days or so to recover the cost of retraining. The DSK or Dvorak keyboard was about 20 to 40 percent more efficient than the QWERTY version.

Now, the QWERTY version had already existed for a while. It was patented between 1909 and 1924, depending on what country you’re in. It was originally developed by Christopher Latham Sholes of Milwaukee, Wisconsin, and some of his partners, including Carlos Glidden and Samuel Soulé. Now, they were engaged with, uh, let’s see, I guess, entrepreneur James Densmore – promoter slash venture capitalist, you might want to say. And Densmore had some contacts with a manufacturing company that had some significant machine tool capabilities: an arms manufacturer by the name of Remington. They were also getting into sewing machines at the time, you know, diversifying the portfolio, so to speak. And while business was good during the Civil War, the economic downturn that followed in the 1870s meant that sales weren’t much. They were selling, just for the record, about 1,200 units a year. So at the time, typewriter sales were more like what we see with mainframe computer systems today. But in the 1870s, there was actually a lot of development going on. Edison was working on his teletype machines, and there are patents for that in the 1870s. There was a lot of other communication equipment being developed, and it was being rolled out across the country.

So there was actually a lot of innovation taking place within that space. And in that, we have the development of the QWERTY keyboard. There were other competing types as well. Like we said, the Dvorak didn’t come around until the 20th century. There was the “Ideal” keyboard, which had the sequence D H I A T E N S O R in the home row – those ten letters being ones you could compose 70 percent of the words in the English language with. And all of this development was indicative of a lot of growth going on in the field. The singular advantage that QWERTY had was that, you know, it slowed down the typist so the machine didn’t jam as often. And that led to a slight advantage over some of the other competitors, in addition to having Remington as the manufacturer. And this advantage was multiplied with the advent of touch typing in the 1880s, as the hunt-and-peck method kind of fell out of use. Keyboardists who could type by touch were in demand, because that learned skill of being able to use a QWERTY keyboard meant that they were that much more efficient, at least compared to the hunt-and-peck typists, and again, like we said, the tech wouldn’t jam up and result in a slowdown. And it was this learned skill that led to the technological lock-in, and a suboptimal design like the QWERTY keyboard becoming the de facto standard.

As David described it, there were three characteristics that led to this: technical interrelatedness, economies of scale, and the quasi-irreversibility of learning the skill.

Now, the technical interrelatedness was the link between the hardware of the typewriter and the software of the typist – or we might rather say the wetware of the typist, to use Rudy Rucker’s term. I mean, the particular arrangement of any given keyboard was largely irrelevant. But the installed user base, so to speak, of typists who were able to use that arrangement quickly and efficiently by memory was much more important.

The economies of scale were linked directly to the manufacturing capabilities that Remington had. As we said, they had a great machine tool setup, so they were able to produce the equipment. And then, as other vendors adopted it, it was more and more available for other typists to use. So if a typist was going to pick among the available options, they might as well learn QWERTY, because people were paying for people who could use it.

And the training wasn’t free, right? The typist had to learn it on their own and then bring the skill to the company; it wasn’t being handed out there. And this relates to the quasi-irreversibility as well. Like, you can retrain, but it’s going to cost you money. And while you’re retraining, you’re obviously not earning anything, and you may still have some crossover issues, and you don’t know if the thing you’re retraining to is going to be any better than the one you already know. In this case, if you know QWERTY, you’re probably going to stick with a QWERTY keyboard, or demand it at your new employer: “I can offer QWERTY, do you have it?” Similar to what we see with Adobe Photoshop and other technologies that are currently extant.

But this is ultimately one of the problems and downsides of path dependency and lock-in. To quote David: “competition in the absence of perfect futures markets drove the industry prematurely into standardization on the wrong system.” End quote. Because nobody could really foresee that the technical problems the QWERTY system was designed to solve would soon be resolved themselves, and here we are in 2023 with electronic keyboards still using this same layout, even though the mechanical issue it was designed to resolve, one that came about 150 years ago, no longer applies.
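This increasing-returns dynamic is easy to see in a toy simulation – my own sketch for illustration, not anything from David’s paper, and the function name and parameters are all hypothetical. Two technologies compete: A is intrinsically (slightly) better, but every adopter adds a small network bonus to whichever side they joined, so early chance adoptions can tip the whole market onto the worse standard:

```python
import random

def simulate_adoption(n_adopters=1000, base_a=0.51, base_b=0.50,
                      network_bonus=0.01, noise=0.1, seed=None):
    """Toy model of adoption under increasing returns.

    Technology A has the higher intrinsic payoff, but each adopter adds
    a small network bonus to the side they picked, so whichever side
    gets lucky early tends to snowball into the de facto standard.
    """
    rng = random.Random(seed)
    count_a, count_b = 0, 0
    for _ in range(n_adopters):
        # Perceived payoff = intrinsic quality + network effect + noise
        payoff_a = base_a + network_bonus * count_a + rng.gauss(0, noise)
        payoff_b = base_b + network_bonus * count_b + rng.gauss(0, noise)
        if payoff_a >= payoff_b:
            count_a += 1
        else:
            count_b += 1
    return count_a, count_b

# Run many independent "markets": a fair share of them lock in on B,
# the intrinsically worse technology, purely because of who happened
# to adopt it early.
results = [simulate_adoption(seed=s) for s in range(100)]
b_wins = sum(1 for a, b in results if b > a)
print(f"{b_wins} of 100 simulated markets locked in on the worse standard")
```

The point isn’t the specific numbers, which are made up; it’s that even with A strictly better on intrinsic quality, the network bonus means remote, chance-dominated early events can decide the outcome – which is roughly David’s argument about QWERTY.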

So yeah, if you don’t necessarily have the best technical solution – like VHS, or AOL, or Microsoft in this instance – try locking up the market by other means. The path dependency means it may pay off for you in the long run if you can stick around.

And just to bring this back around full circle to our example of Roman roads, rail lines, and rocket ships, that’s an example of path dependency.  There’s no direct causal relationship, which is what everybody gets wrong about it. As David states: “important influences upon the eventual outcome can be exerted by temporally remote events, including happenings dominated by chance. There are things that shape our economic decisions that are well beyond our ability to fathom or even control.”

Now, earlier I did state that there were a number of examples like Triple E, or things like it, in the news, and it’d be prudent to get on to the next one. One of the bones of contention in the Microsoft antitrust case was their bundling of Internet Explorer with the Windows operating system. People said that was anti-competitive, and that they were using their monopoly power to push it as a de facto standard. And that’s one of the ways that lock-in can happen: when a functional standard becomes a de facto standard. Currently, we’re seeing this with Chromium, which is the engine behind Google’s Chrome browser, used by everything from Edge to Opera to Chrome itself. And it’s also in the default install on every Android device.

Much like how Microsoft’s Windows in the 1990s was about 95 percent of the personal computing market, Google’s Chromium makes up about 95 percent of the browser market in 2023. The alternatives are pretty much limited to Firefox, Safari, and a few derivatives. So when Google decides to make major changes to Chromium, it can reverberate throughout the industry. It affects everybody. And in late July and early August, they started doing that. They rolled out something called WEI, or Web Environment Integrity, as a proposed change to Chromium. It first appeared in July as a proposal in the GitHub repos of some of Google’s Chromium engineers, and it received a pretty universal outcry from those who were paying attention to it. What it proposes is an attestation check between the browser and the hardware of the machine. Ostensibly it’s meant to combat online piracy or cheating in games, but the problem is that those are edge cases, and it could be used for other purposes. One of the most noticed is that it could be used to detect whether somebody’s running an ad blocker in their browser, or to single out specific extensions. It turns the internet into a permission-based system rather than an open one. It turns everything into a walled garden run by Google, where they can pass judgment on users based on whatever opaque criteria they might have. And while that’s one example, it’s not the only case currently involving Google.

The other one that’s going on right now is the antitrust case brought by the U.S. Department of Justice against it for its monopoly power with regard to online search. And if you haven’t heard much about that one, it’s not surprising, because Google’s been doing pretty much everything it can to limit the exposure of any information coming out of the trial. Much of it’s happening behind closed doors. There’s been some reporting in the New York Times, Bloomberg, and Ars Technica, and I’ll put some links to that in the notes.

And that’s not the only case going on, because on September 26, 2023, the FTC in the U.S. and 17 state attorneys general sued Amazon.com, alleging that the online retail and technology company is a monopolist that uses a “set of interlocking, anticompetitive, and unfair strategies to illegally maintain its monopoly power. The FTC and its state partners say Amazon’s actions allow it to stop rivals and sellers from lowering prices, degrade quality for shoppers, overcharge sellers, stifle innovation, and prevent rivals from fairly competing against Amazon. It alleges that Amazon violates the law not because it is big, but because it engages in a course of exclusionary conduct that prevents current competitors from growing and new competitors from emerging.” End quote. At the time of recording, that’s just a couple of days old. So, as they say, more to come.

Now, there’s nothing in particular that links an alleged monopoly in online shopping to another one alleged for online search, to a potential one for social networking, to another one impacting online browsing, to, you know, a case that dealt with monopoly regarding operating systems and online browsing from 20 years ago. But there are some commonalities, aside from them all being massive tech companies – and in some cases the same ones. As Bill Gates commented in 2019, on the 20th anniversary of the antitrust suit, one of the things the tech companies learned is that they had to be more present in Washington and to lobby more effectively.

Back in the 90s, it was a point of pride for Bill Gates that they never really engaged with lobbyists. But they changed their strategy with respect to that following the antitrust trial, and everybody else in the tech industry took notes and followed suit. Now, depending on your level of involvement in online tech news, a lot of what we’ve shared here may seem like common knowledge, but not everyone may share that knowledge.

What we’re trying to do is just bring attention to the ongoing events that are still taking place, especially with everyone’s eyes thoroughly focused on things like LLMs and generative AI tools like ChatGPT. These are just current examples, high-profile ones that attract our attention. And there are others happening at various levels of technological development that we might not see, or that might not have a large impact, just because they affect a very niche audience and don’t have the broad reach of things like shopping and search and browsing and social media.

What I hope to bring to your attention is the impact that things like lock-in and path dependency can have on that development: they can reduce the available options, and we may get stuck with an outmoded technology, something like the QWERTY keyboard, when there would be better solutions available to us.

Because it keeps happening again and again and again, maybe it isn’t necessarily a case of path dependency, where we keep falling into ruts that have been well worn before. Rather, perhaps the environment as a whole affords certain outcomes in the regulatory framework of monopoly capitalism that we’ve discussed in the past; we can see it happening more often in such a framework. So rather than it being one particular path, the slopes of the hill encourage flows in certain directions. Exploring this would shift us more thoroughly into evolutionary economics, full stop, which we’ll leave for a future episode – a path off in the distance.

Next time, in episode 16, we’ll be looking at spreadable media, which we’ve hinted at earlier. And with the WGA strike being potentially resolved by the time you hear this, with hopefully the SAG AFTRA strike soon to follow, we may be returning to some media focused episodes soon, too. Until next time, I’ve been your host, Dr. Implausible. You can contact me at drimplausible at implausipod. com. Have fun.