[iDC] please make comments regarding semantic overlay term

Danny Ayers danny.ayers at gmail.com
Thu Mar 27 18:32:35 UTC 2008


On 27/03/2008, Paul Prueitt <psp at ontologystream.com> wrote:

> - that bit I'm not so sure about. The thing is, everyone already has their
> own approach to building executable components - LAMP setups, scrappy bits
> of Python, J2EE monstrosities, MS/OS X/Linux desktops. How might one achieve
> a decent level of adoption of a new infrastructure? Surely it'll be better
> to look for existing common interfaces, and build against them, thereby
> reusing what's already out there...?
>
>
> The notion of being unsure about a "complete solution" may be contrasted
> with the mess we have with the current generation of "stuff".
>


Ah, Paul, the glass half-empty again. It may be a mess, but it's a mess of
billions of interconnected systems - and while what there is may be
suboptimal, it's still useful. A key aspect is that standard interfaces are
used, so you are free to choose your own local "stuff".


> With any proposed move-ahead, like Cubicon, reality should be questioned,
> but with an open mind that realizes we, and venture capital, have become
> numb to the false advertising that is now characteristic of the IT sector.
> Being open-minded means looking objectively at what Cubicon is and why it
> is the way it is.
>

I don't disagree, but would note that venture capital doesn't yet seem to
have tired of throwing money at systems that offer little more than shiny
baubles - check some of the material that goes through techcrunch.com.

I must confess to being rather skeptical of some of the claims around the
Cubicon site - though I'll be happy to be proven wrong. The following did
make me chuckle, however: "Cubicon overcomes the complexities of W3C Resource
Description Framework (RDF) <http://www.w3.org/TR/rdf-primer/> and Web
Ontology Language (OWL) <http://www.w3.org/TR/owl-features/> specifications."
The material displayed in the diagrams appears at least an order of
magnitude more complex than the Semantic Web technologies.
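
For contrast, the core RDF model really is small - just
subject-predicate-object triples. A minimal sketch in Python with rdflib
(the namespace URI and terms below are purely illustrative, not any real
vocabulary):

    from rdflib import Graph, Literal, Namespace, RDF

    # Illustrative namespace - not a real vocabulary
    EX = Namespace("http://example.org/terms/")

    g = Graph()
    g.bind("ex", EX)

    # Two statements: a typed resource and one property
    g.add((EX.widget, RDF.type, EX.Tool))
    g.add((EX.widget, EX.label, Literal("a widget")))

    print(g.serialize(format="turtle"))

That's the whole data model; everything else layers on top of it.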


> The key here is that recombinant components are produced from a framework
> having the power to factor processes and data into specific dimensions, and
> thereby provide commonality that arises from user use patterns, rather than
> from programmer behavioral patterns.  We free the marketplace from extra and
> non-useful (from the user's side) control and entitlements.  The programmer
> community has to adjust, but the overall value to the world, to information
> systems, economic systems, etc., is huge.
>


I'll wait and see on that.


> The first round of development is provided with IP protection, and thus
> will benefit a small portion of the programmer community; after that the
> demand for "programming" should go down radically.  Reuse means reuse,
> yes?
>


Well, such an approach isn't particularly new - it's not far off what
Microsoft have been doing for the past few decades (with some success).


> This is what is new: the shift from programmer-oriented IT to user-oriented
> communication in an infrastructure that empowers collective real-time
> behavior and thus "collective intelligence".
>


I believe we are generally heading in that direction, the infrastructure
being the Web.


> We are, as a world civilization, just beginning to see this phenomenon with
> the popularity of Obama being captured by a growing community of new voters
> in the US, perhaps to change forever the nature of politics and the mass
> media.  (This is just one example.)
>


I'll wait and see on that too.

> In other words -
> [[
> This automated service infrastructure will enable heterogeneous systems to
> effectively communicate and initiate rapid adoption of Service-Oriented
> Architecture (SOA).
> ]]
> - something like that, only my bet would be on using (RESTful) HTTP+RDF to
> leverage the existing Web infrastructure, in other words go for the Semantic
> Web.
>
> For my own stuff I've been gravitating towards an (almost)
> lowest-common-denominator kind of abstraction based around simple agents,
> each typically composed of an HTTP client, (access from) an HTTP server, a
> local RDF model and local behaviour. The common interface is the
> (Semantic) Web. Coincidentally, I've recently been looking over the old
> IdeaGraph code with a view to seriously componentizing it so I can refactor
> it more closely to this approach.
>
>
> Yes, the lowest common denominator approach.  This is often seen.  But
> like in a college mathematics class, if one teaches to the lowest common
> denominator, the students evolve the instruction in the direction of having
> less and less capability.
>


Perhaps I should have phrased that differently - how about "the minimum
necessary to get the job done"? The job here is to facilitate
quasi-autonomous agents in the Web environment, in a form that doesn't
introduce unnecessary complexity and that leverages existing capabilities.
Nothing new in that per se, but the approaches I've seen to date have fallen
down on one aspect or another. I believe the Web architecture essentially
already has the solutions; it's just that Web developers tend to slip into
inappropriate abstractions - in no small part due to that "stuff" mentioned
earlier.

I should point out that this is just my personal exploration of potential
avenues for making the Web better. I've no desire to go chasing VC funding,
and far prefer the open source community to closed proprietary development
environments. I'm fortunate enough to have a day job that pays the bills,
working for a company with similar aims, also using Semantic Web
technologies.
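
To make the agent idea concrete, here's a rough sketch in Python with rdflib
of the shape I have in mind - illustration only, not IdeaGraph code; the
class, names and URL are made up, and the HTTP-server-facing side is left
out:

    import time
    from rdflib import Graph

    class WebAgent:
        """An HTTP client, a local RDF model and some local behaviour."""

        def __init__(self, sources):
            self.sources = sources   # URLs of RDF documents to watch
            self.model = Graph()     # the agent's local RDF model

        def pull(self):
            # HTTP client side: GET each source into the local model;
            # rdflib picks a parser from the response content type
            for url in self.sources:
                self.model.parse(url)

        def act(self):
            # Local behaviour: react to the model - here, just report size
            print("model now holds %d statements" % len(self.model))

    agent = WebAgent(["http://example.org/data.rdf"])  # illustrative URL
    while True:
        agent.pull()
        agent.act()
        time.sleep(3600)  # poll hourly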

Cheers,
Danny.

-- 
http://dannyayers.com
~
http://blogs.talis.com/nodalities/this_weeks_semantic_web/

