Does AI mean we don't need the Semantic Web?
If you hang around with computerists long enough, they start talking about the Semantic Web. The idea is that if you can represent human knowledge in a way that's easy for computers to understand, it will be transformative for information processing.
But computers, traditionally, haven't been very good at parsing ambiguous human text.
Suppose you saw this text written for a human:
Our opening hours are: Weekdays 10 until 7. Weekend 10 until 10 (Early closing 9 o'clock Sunday).
Not the most straightforward sentence, but pretty easy for a human to parse.
Until recently, the best way to represent that for a computer was something like:
<meta itemprop="openingHours" content="Mo-Fr 10:00-19:00"/>
<meta itemprop="openingHours" content="Sa 10:00-22:00"/>
<meta itemprop="openingHours" content="Su 10:00-21:00"/>
or
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "openingHours": ["Mo-Fr 10:00-19:00", "Sa 10:00-22:00", "Su 10:00-21:00"]
}
</script>
A tightly constrained vocabulary which can be precisely parsed by a simple state machine. Easy to ingest, interpret, and query. Easy for machines, that is. As much as I love the Semantic Web, it is hard for humans to write, update, and maintain.
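As an aside, those meta elements only feed data to a parser when they sit inside an element carrying itemscope and an itemtype, so the full markup looks something like this rough sketch (the div wrapper and the business name are invented placeholders, not taken from the example above):
<div itemscope itemtype="https://schema.org/LocalBusiness">
  <!-- "Example Shop" is a placeholder name -->
  <span itemprop="name">Example Shop</span>
  <meta itemprop="openingHours" content="Mo-Fr 10:00-19:00"/>
  <meta itemprop="openingHours" content="Sa 10:00-22:00"/>
  <meta itemprop="openingHours" content="Su 10:00-21:00"/>
</div>
It's that extra scaffolding, kept in sync with the visible text, which makes the format tedious for human authors.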
But we have AI now. So do we need to mark up documents specifically for machines?
I fed the text into OpenAI's ChatGPT. Here's what it said:

It isn't just capable of parroting back data - it can perform moderately complex reasoning:

Do we need to write for computers any more?
One of the demands of the Semantic Web was that we should use HTML elements like <address> to clearly mark up a contact address, and that we should wrap dates and times in the <time> element.
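In practice that looks something like this rough sketch, where the contact details and the specific times are invented placeholders:
<address>
  <!-- Placeholder contact details -->
  Example Shop, 123 High Street<br>
  <a href="mailto:hello@example.com">hello@example.com</a>
</address>

<p>We open at <time datetime="10:00">10 o'clock</time> and close at <time datetime="19:00">7 in the evening</time>.</p>
The machine-readable value lives in the datetime attribute, while the human-readable wording stays free-form.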
Is that now redundant?
We still need to write clearly and unambiguously. But do we need separate "machine-readable" HTML if machines can now read and interpret text designed for humans?