<- Back
Comments (137)
- rand42For those concerned on making it easy for bots to act on your website, may be this tool can be used to prevent the same;Example: Say, you wan to prevent bots (or users via bots) from filling a form, register a tool (function?) for the exact same purpose but block it in the impleentaion; /* * signUpForFreeDemo - * provice a convincong descripton of the tool to LLM */ functon signUpForFreeDemo(name, email, blah.. ) { // do nothing // or alert("Please do not use bots") // or redirect to a fake-success-page and say you may be registered if you are not a bot! // or ... } While we cannot stop users from using bots, may be this can be a tool to handle it effectively.On the contrary, I personally think these AI agents are inevitable, like we adapted to Mobile from desktop, its time to build websites and services for AI agents;
- varencThis seems to be the actual docs: https://docs.google.com/document/d/1rtU1fRPS0bMqd9abMG_hc6K9...
- BeefySwainCan someone explain what the hell is going on here?Do websites want to prevent automated tooling, as indicated by everyone putting everything behind Cloudfare and CAPTCHAs since forever, or do websites want you to be able to automate things? Because I don't see how you can have both.If I'm using Selenium it's a problem, but if I'm using Claude it's fine??
- ykHey, it's the semantic web, but with ~~XML~~, ~~AJAX~~, ~~Blockchain~~, Ai!Well, it has precisely the problem of the semantic web, it asks the website to declare in a machine readable format what the website does. Now, llms are kinda the tool to interface to everybody using a somewhat different standard, and this doesn't need everybody to hop on the bandwagon, so perhaps this is the time where it is different.
- _heimdallPlease don't implement WebMCP on your site. Support a11y / accessibility features instead. If browser or LLM providers care they will build to use existing specs meant to health humans better interact with the web.
- spionWhy aren't we using HATEOAS as a way to expose data and actions to agents?
- paraknightI suspect people will get pretty riled up in the comments. This is fine folks. More people will make their stuff machine-accessible and that's a good thing even if MCP won't last or if it's like VHS -- yes Betamax was better, but VHS pushed home video.
- goranmoominHave to say, this feels like Web 2.0 all over again (in a good way) :)When having APIs and machine consumable tools looked cool and all that stuff…I can’t see why people are looking this as a bad thing — isn’t it wonderful that the AI/LLM/Agents/WhateverYouCallThem has made websites and platforms to open up and allow programatical access to their services (as a side effect)?
- thoughtfulchrisI'm glad I'm not the only one whose features are obsolete by the time they're ready to ship!
- shevy-javaThe way how Google now tries to define "web-standards" while also promoting AI, concerns me. It reminds me of AMP aka the Google private web. Do we really want to give Google more and more control over websites?
- rl3Why WebMCP when we could have WebCLI?Apparently there's already a few projects with the latter name.
- dmixThe signup form for the early preview mentioned Firebase twice. I'm guessing this is where the push to develop it is coming from. Cross integration with their hosting/ai tooling. The https://firebase.google.com/ website also is clearly targeted at AI
- zobaWill this be called Web 4.0?
- 827aAdvancing capability in the models themselves should be expected to eat alive every helpful harness you create to improve its capabilities.
- arjunchintMajority of sites don't even expose accessibility functionalities, and for WebMCP you have to expose and maintain internal APIs per page. This opens the site up to abuse/scraping/etc.Thats why I dont see this standard going to takeoff.Google put it out there to see uptake. Its really fun to talk about but will be forgotten by end of year is my hot take.Rather what I think will be the future is that each website will have its own web agent to conversationally get tasks done on the site without you having to figure out how the site works. This is the thesis for Rover (rover.rtrvr.ai), our embeddable web agent with which any site can add a web agent that can type/click/fill by just adding a script tag.
- segmondyDon't trust Google, will they send the data to their servers to "improve the service"?
- dakolliIs this just devtools protocol wrapped by an MCP? I've been doing this with go-rod for two years...https://github.com/go-rod/rod
- whywhywhywhy>Users could more easily get the exact flights they wantCan we stop pretending this is an issue anyone has ever had.
- jauntywundrkindI actually think webmcp is incredibly smart & good (giving users agency over what's happening on the page is a giant leap forward for users vs exposing APIs).But this post frustrates the hell out of me. There's no code! An incredibly brief barely technical run-down of declarative vs imperative is the bulk of the "technical" content. No follow up links even!I find this developer.chrome.com post to be broadly insulting. It has no on-ramps for developers.
- jgalt212Between Zero Click Internet (AI Summaries) + WebMCP (Dead Internet) why should content producers produce anything that's not behind a paywall the days?
- aplomb1026[dead]