Are lexicons silos?

I have a PDS. I created a lexicon to describe some specific data I have on my website, and I created records of this data. These records are not viewable anywhere because no AppViews know about my lexicon.

Standard Site proposes to be a solution for “long-form publishing on AT Protocol. Making content easier to discover, index, and move across the ATmosphere”. Indeed, if I wrap my lexicon in a site.standard.document it does show up on leaflet.pub and other sites, but they still can’t actually display my data. If I examine posts originating from leaflet.pub, I see that they, too, are using a proprietary lexicon wrapped in a site.standard.document shell. I also see that several of the other sites using site.standard.document have their own lexicons that look and function much like leaflet.pub’s, but are nonetheless not compatible. (I’m not implying that one copied the other, and I don’t know which of these sites came first.)

We are still not providing a common format for displaying data across multiple dissimilar platforms. Of course such a format exists, and has for a long time. It’s called HTML, and I can use it to describe a document that can be read by any web browser on any platform.

This is not meant to be a criticism of ATproto or lexicons, or anything really. Just my observations.

2 Likes

You might be interested in the talk by Dan Abramov happening right now: Social Components.

1 Like

Nah, I can just request them from your PDS or via tools like slingshot or constellation.
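Fetching records straight from a PDS really is just one XRPC call, `com.atproto.repo.listRecords`. A minimal sketch of building that request URL; the PDS host, DID, and collection NSID below are placeholders, not real identifiers:

```python
from urllib.parse import urlencode

def list_records_url(pds: str, repo: str, collection: str, limit: int = 50) -> str:
    """Build the com.atproto.repo.listRecords XRPC URL for a repo/collection."""
    query = urlencode({"repo": repo, "collection": collection, "limit": limit})
    return f"{pds}/xrpc/com.atproto.repo.listRecords?{query}"

# Placeholder PDS host, DID, and a hypothetical custom lexicon NSID:
url = list_records_url(
    "https://example-pds.example",
    "did:plc:abc123",
    "com.example.myLexicon.record",
)
print(url)
```

Any HTTP client can then GET that URL and receive the raw records as JSON, no AppView required.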

4 Likes

I agree that drastically subsetting the capabilities of the HTTP web (content negotiation, for example, or advanced HTML tactics) leaves a ton of interop and backwards compatibility on the table. It’s kinda my biggest regret in my own work to date: atp is kind of a private web or splinterweb, in w3c terms.

To your point about lexica being a hollow shell without some kind of micro reference implementation, I would point to the tiles work, specifically lexicon-specific intents. It’s admittedly early, more a glint in the eye of a plan for a plan than a solution to what you’re describing, but hey, maybe something you’d want to track or contribute to. It addresses your use case but perhaps not your philosophical point, because it is a splinterweb/private-web solution to the kind of open-world semantics issue that remains difficult to solve without, I dunno, the Semantic Web?

2 Likes

One thing that might be useful for this in the future is:

It’s similar in concept to Cambria, and is a way to convert records between lexicons.
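The core of a Cambria-style translation can be sketched as a declarative field mapping between two record shapes. Everything here is hypothetical: the lexicon names and fields are made up, and real lens-based tools also handle nested paths, defaults, and bidirectional conversion:

```python
# Hypothetical mapping between two made-up blog-post lexicons:
# com.example.blog.post -> net.sample.article
FIELD_MAP = {
    "title": "name",
    "body": "content",
    "createdAt": "publishedAt",
}

def translate(record: dict, field_map: dict) -> dict:
    """Copy known fields under their mapped names; unmapped fields are dropped."""
    return {dst: record[src] for src, dst in field_map.items() if src in record}

source = {"title": "Hello", "body": "First post", "createdAt": "2024-01-01"}
translated = translate(source, FIELD_MAP)
print(translated)
```

With a shared registry of such mappings, an AppView could render records from a lexicon it has never seen by translating them into one it does understand.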

I personally don’t feel HTML makes sense as a native format for storage.

Things like Leaflet store information in a shape that makes semantic sense for them. To me it makes sense to do that, rather than trying to represent those semantics in specially formatted HTML that both has to look a certain way and has to unambiguously encode Leaflet-specific information.
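That distinction, storing semantics and producing presentation at read time, can be made concrete. The record shape below is hypothetical, but the pattern is the point: the lexicon owns the meaning, and HTML is just one rendering of it:

```python
import html

def render_post(record: dict) -> str:
    """Render a hypothetical structured post record as display HTML,
    escaping user-supplied content on the way out."""
    title = html.escape(record.get("title", ""))
    body = html.escape(record.get("body", ""))
    return f"<article><h1>{title}</h1><p>{body}</p></article>"

out = render_post({"title": "Silos & lexicons", "body": "Hello"})
print(out)
```

Storing the HTML string itself would bake in one presentation and force every other consumer to parse it back out to recover the fields.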


More generally, though, I think lexicons can be a kind of silo and sometimes that’s intentional.

I think that standard.site has done an excellent job of reducing the interop requirement to something extremely useful for discovery: it lets content be discovered, such as content that might have come from my completely custom-styled Astro blog, and can direct you to the appropriate place to view that content officially.

Eventually maybe they’ll also have interoperable content formats, but nothing forces them to right now, and I think it’s good that it’s possible to be different.

In other words, figuring out whether we want to use the same lexicon for things is a social problem as much as (or sometimes more than) a technical one, and I think it can be healthy to allow differences.

3 Likes

Was this talk recorded, or is there a writeup available?

Yep! You can view it here: atmo.rsvp

Search for “social components” on that page; since the video isn’t cut yet, you’ll have to skip to about 1 hour and 23 minutes in.

Conceptually I get the idea of mapping lexicons to component props; I had the same thought myself. But what he built at inlay.at is so far above my skill level it feels like magic.

1 Like

Strongly agree that HTML is a good canonical reference state for translation into other protocols. We’re building stuff that does that: [Proposal] Protocol-agnostic PDS - #6 by octothorpes.bsky.social

1 Like