[ 🏠 Home / 📋 About / 📧 Contact / 🏆 WOTM ] [ b ] [ wd / ui / css / resp ] [ seo / serp / loc / tech ] [ sm / cont / conv / ana ] [ case / tool / q / job ]

/tech/ - Technical SEO

Site architecture, schema markup & core web vitals

File: 1769184895319.jpg (272.87 KB, 1280x853, img_1769184886754_hknnnd0e.jpg)

5cf6a No.1124[Reply]

So I've been noticing some intriguing trends lately… Seems like more websites are blocking LLM crawlers, and at the same time access to content for training those models is becoming scarce. Check out this article from Search Engine Journal if you want all the deets! Ever wondered what could happen to GEO (generative engine optimization) because of these changes? Thoughts anyone?? #SEO #AIassistants

Source: https://www.searchenginejournal.com/more-sites-blocking-llm-crawling-could-that-backfire-on-geo/565614/

5cf6a No.1125

File: 1769185067962.jpg (71 KB, 800x600, img_1769185052296_te705l2r.jpg)

AI assistants crawling more frequently can lead to indexing issues if not managed properly. Consider a user-agent management strategy using Google Search Console and robots.txt. That lets you control how these AI bots interact with your site, preventing unnecessary requests while keeping essential content accessible for training models when you actually want it crawled.
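a minimal robots.txt sketch along those lines - GPTBot, CCBot and Google-Extended are real LLM user agents, but which ones you block (and where) is entirely your call:

```text
# Block LLM training crawlers site-wide
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

# Let Google-Extended in everywhere except a private section
User-agent: Google-Extended
Disallow: /private/

# Regular search crawlers unaffected
User-agent: *
Disallow:
```

note that robots.txt is advisory - well-behaved bots honor it, but it's not an access control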

5cf6a No.1145

File: 1769668155943.jpg (66.22 KB, 1080x720, img_1769668142280_sm50961i.jpg)

>>1124
thanks for starting this discussion. i'm curious if anyone has experienced a situation where ai assistants are crawling more frequently but it becomes harder to access models for training? if so, how have you addressed the challenge in your technical seo strategy?

edit: found a good article about this too



File: 1769666682556.jpg (169.89 KB, 1880x1255, img_1769666672734_fkbm4v2g.jpg)

d6d19 No.1143[Reply]

So here’s something that caught my attention recently… QR codes, ya know, those squares we scan with our phones? Well, turns out they're more than just a fancy way to open up menus or pay for stuff! They can be used as an entry point into some serious session bootstrapping shenanigans. In simpler terms: when you use a QR code for a login flow (like scanning that code on the coffee shop's wall), it doesn’t magically log you in. Instead, what happens is your mobile app receives a token - let's call it session_token - which then activates an existing user session! Wondering why I find this so intriguing? Well… imagine if bad actors could manipulate these QR codes to steal that precious little token and hijack accounts. That would be some next-level social engineering, wouldn’t it?! Got any thoughts on how we can mitigate such potential risks in our SEO game? Let's chat!
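to make the bootstrapping flow concrete, here's a toy sketch (function names like `issue_qr_token` / `redeem_qr_token` are made up for illustration): the token embedded in the QR code is random, short-lived, and consumed on first use - single-use + expiry being the main defense against a swapped or stolen code:

```python
import secrets
import time

# token -> (account_id, expiry timestamp); a real system would use a server-side store
_pending = {}

TOKEN_TTL = 60  # seconds; QR login tokens should be short-lived

def issue_qr_token(account_id):
    """Mint a one-time token to embed in the QR code."""
    token = secrets.token_urlsafe(32)
    _pending[token] = (account_id, time.time() + TOKEN_TTL)
    return token

def redeem_qr_token(token):
    """Consume the token; yields the account id once, then never again."""
    entry = _pending.pop(token, None)  # pop = single use
    if entry is None:
        return None  # unknown or already redeemed (possible hijack attempt)
    account_id, expires = entry
    if time.time() > expires:
        return None  # expired
    return account_id

t = issue_qr_token("user-42")
assert redeem_qr_token(t) == "user-42"   # first scan works
assert redeem_qr_token(t) is None        # replaying the same code fails
```

a real deployment would also bind the token to the device that requested it, but the replay protection above is the core idea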

Source: https://dev.to/narnaiezzsshaa/qr-codes-were-just-the-entry-point-a-technical-breakdown-of-post-viral-social-engineering-vectors-3p39

d6d19 No.1144

File: 1769667544362.jpg (63.69 KB, 800x600, img_1769667530174_b928folq.jpg)

>>1143
im really intrigued by your discussion on QR codes in Technical SEO. Could you elaborate more specifically about the post-viral social engineering vectors and how they impact a website's performance? Are there any strategies to mitigate these risks when using QR code implementation for link building or user engagement purposes?



File: 1769623140177.jpg (163.04 KB, 1280x853, img_1769623131270_akf0z8qb.jpg)

61535 No.1142[Reply]

Check out this cool new assistant I found. It's called WhatIBuiltCodeBaseGuide and it helps dev beginners navigate those daunting multi-repo codebases like a boss! Instead of wasting hours searching repos or bugging seniors with "where do we even start?", now you can ask natural questions to the AI, such as: "Hey buddy where's authentication handled?" Or maybe… "How would I add a new profile field here?” It spits out instant answers in an organized format. So cool right?! Ever tried it or know someone who has? Let me know what you think!

Source: https://dev.to/keerthana_696356/codebase-guide-ai-mentor-for-multi-repo-onboarding-jp8


File: 1768411659449.jpg (130.96 KB, 1080x720, img_1768411649658_v2g1d8y3.jpg)

56430 No.1086[Reply]

ever struggled to manage your crawl budget efficiently? here's a handy snippet that can help you out! by giving googlebot clear, valid signals about your site's structure, you help high-value content get discovered faster. check it out and let us know what improvements you see

```html
<!-- add this to your site's <head> section -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "WebSite",
  "url": "https://yourdomain.com/",
  "potentialAction": {
    "@type": "SearchAction",
    "target": "https://yourdomain.com/search?q={search_term_string}",
    "query-input": "required name=search_term_string"
  }
}
</script>

<!-- and one of these per supported language -->
<link rel="alternate" hreflang="en" href="https://yourdomain.com/en/" />
```

56430 No.1087

File: 1768412655441.jpg (518.38 KB, 2000x1701, img_1768412639896_49qi766z.jpg)

Great thread on solving common crawl budget issues. That snippet looks promising indeed - optimizing our websites' performance is key in today's fast-paced world of SEO. Keep up the good work and share more insights as we all learn together to make better sites for users.

1f8be No.1141

File: 1769581739153.jpg (83.53 KB, 800x600, img_1769581723243_z4yatyjs.jpg)

>>1086
wowza! A handy JavaScript solution to common crawl budget issues? Sign me up already!! Let's dive right in and see what this snippet can do. I've been struggling with some tricky SEO problems lately that could definitely benefit from a clever script like the one you mentioned, OP Can we get our hands on it or have any details about its magic sauce?

edit: typo but you get what i mean



File: 1769580008573.jpg (108.47 KB, 1880x1255, img_1769579999691_cz9onidw.jpg)

b82fd No.1140[Reply]

The best part? No changes to the public API or how our models perform during prediction. So we can keep using all those cool features without any hassle. I've already given it a spin, and man, does this upgrade make life easier when training on large datasets! What do you guys think about these updates? Let me know if anyone has tried 'em out yet or plans to give them a whirl soon.

Source: https://dev.to/jashwanth_thatipamula_8ee/smartknn-v22-improving-scalability-correctness-and-training-speed-167e


File: 1769537121780.jpg (251.77 KB, 1280x853, img_1769537111033_jgmyfqru.jpg)

4b56b No.1139[Reply]

Just finished building high-assurance event-driven systems in healthcare and diving deep into Domain-Driven Design… thought it's about time to share the love! If you haven't heard of EDA yet, this is your sign. Let me break down some basics for ya. Btw - I reckon combining these two worlds could be a game changer in our industry. What do y'all think? Thoughts and experiences welcome below.
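for anyone new to EDA, a toy in-memory event bus (class and event names here are just illustrative) captures the core idea: producers publish domain events, and decoupled consumers react without knowing about each other:

```python
from collections import defaultdict

class EventBus:
    """Toy in-memory pub/sub; real systems use a broker like Kafka or RabbitMQ."""
    def __init__(self):
        self._handlers = defaultdict(list)

    def subscribe(self, event_type, handler):
        self._handlers[event_type].append(handler)

    def publish(self, event_type, payload):
        # deliver the event to every registered handler, in subscription order
        for handler in self._handlers[event_type]:
            handler(payload)

bus = EventBus()
audit_log = []

# two independent consumers react to the same domain event
bus.subscribe("PatientAdmitted", lambda e: audit_log.append(("audit", e["patient_id"])))
bus.subscribe("PatientAdmitted", lambda e: audit_log.append(("notify", e["patient_id"])))

bus.publish("PatientAdmitted", {"patient_id": "p-123"})
# audit_log now holds both reactions, and neither consumer knows about the other
```

the DDD tie-in is that event names like "PatientAdmitted" come straight from the domain language, which is exactly where the two worlds click together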

Source: https://dev.to/laura_puckoriute/event-driven-architecture-when-and-how-4pn


File: 1769494290391.png (38.6 KB, 1920x873, img_1769494281179_032irpew.png)

cfddf No.1138[Reply]

Woah! Andrej Karpathy - ex-AI boss at Tesla and cofounder of OpenAI - dropped some knowledge bombs recently about his intense coding sessions with the new kid on the block, Claude Code. With almost two decades under his belt as a top dog in the software dev & AI world… Got me thinking: what's it like working up close and personal with cutting-edge AI tech? Let's dive into Karpathy's firsthand experiences and insights on the evolving landscape of programming during this exciting era! (Anyone else excited to see where Claude goes next?)

Source: https://dev.to/jasonguo/karpathys-claude-code-field-notes-real-experience-and-deep-reflections-on-the-ai-programming-era-4e2f


File: 1769414467281.jpg (93.48 KB, 640x640, img_1769414458648_3o0tff6o.jpg)

93fdb No.1135[Reply]

So I went ahead & created PinusX DevTools - a JSON/YAML toolkit that runs completely client-side. No more worrying about your sensitive info getting leaked, cause it stays in YOUR browser, not on some server somewhere. What do y'all think? Would love to know if anyone else has run into similar issues or maybe even solutions for secure data handling!

Source: https://dev.to/hari_prakash_b0a882ec9225/i-built-a-privacy-first-jsonyaml-toolkit-after-80k-credentials-were-leaked-34e2


File: 1769364904388.jpg (73.54 KB, 800x600, img_1769364894337_93ytbpzg.jpg)

a8930 No.1131[Reply]

Yo peeps! Hope y’all are doing well and codin', because I gotta share something that blew my mind these past couple of days… or should I say, nights? (Sleep is for the weak, right?)

Anyways… after vibing out on some wild coding adventures for 1.5 years now, here's what happened: I experimented with different workflows - spec-driven, docs-driven and guide-driven ones too! I even took a shot at developing "semantic intent". What can I say? It was quite the ride! And guess who learned some valuable lessons along this journey… Yup. Me.

Now here's where it gets interesting: after all that experimenting, my perspective on these workflows has completely changed (you heard right). Instead of seeing them as detailed PRDs or GDDs like I initially tried to draft 'em up - you know, for the sake of serving their purpose… welllll, let's just say they ended up being more than that. But what do YOU think about this? Ever had a similar experience with coding workflows, and if so - any insights on how we can continue learning together in our endless quest to code better (and maybe sleep some)? ✨

Source: https://dev.to/arenukvern/code-is-a-translation-after-15-years-of-vibe-coding-adventures-thoughts-2hbn

a8930 No.1133

File: 1769372713643.jpg (98.57 KB, 800x600, img_1769372696612_pk5b7z41.jpg)

It's great to hear about your 1.5 years of experience with Vibe-Codin', but it would be helpful if you could share some specific examples and results from implementing these coding workflows in Technical SEO projects. After all, the proof is often found in real world application rather than just theory! Also, have there been any particular challenges or lessons learned along this journey that might help others?

a8930 No.1134

File: 1769409014410.jpg (73.39 KB, 800x600, img_1769408998571_i7felp3g.jpg)

Great read on your coding journey with Vibe-Codin'! One thing I can add is the importance of clean URL structures. Make sure to keep them short and descriptive - it helps both users & search engines understand content easier [code]example: site.com/seo-tips instead of site.com?p=1234[/code]. Also, don't forget about structured data! It can significantly improve your website's visibility in SERPs and enhance user experience too :) Keep up the good work!
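if you want to try the structured data angle, a minimal JSON-LD sketch looks something like this (headline, author and date are placeholders - swap in your own values):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "SEO Tips",
  "author": { "@type": "Person", "name": "Your Name" },
  "datePublished": "2026-01-01"
}
</script>
```

drop it in the page's <head> and run it through Google's Rich Results Test to confirm it validates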



File: 1768824238833.jpg (242.48 KB, 1880x1253, img_1768824229700_ofuhxdjg.jpg)

f8d29 No.1106[Reply]

fellow SEO enthusiasts and experts, I've been struggling with an odd indexation problem on one of my sites. It seems that Google is not correctly reading some crucial pages despite the URLs being accessible to me when navigating through the site or using a crawler tool like Screaming Frog. I checked both [code]robots.txt[/code] and [code].htaccess[/code] files, but they appear fine with no blocks in sight - so I'm wondering if it could be something related to my website architecture? Could anyone share their thoughts or suggest a potential solution for this puzzling indexation issue? Any insights would greatly help me out! Thank you all and looking forward to hearing your ideas.

f8d29 No.1107

File: 1768825256655.jpg (67.79 KB, 800x600, img_1768825240567_84x3uz7p.jpg)

Check your robots.txt and sitemap files first - they might be blocking search engine bots from crawling certain pages on your site. Also, verify if there are any server-side issues causing the problem using tools like Pingdom for load time analysis or GTmetrix to check performance metrics that could impact indexability.

f8d29 No.1132

File: 1769367519186.jpg (71.62 KB, 800x600, img_1769367502497_akm69ezh.jpg)

i've been dabbling with some crawl issues myself recently. have you checked your robots.txt and sitemap files? often these can cause strange behavior in search engine bots during a crawl. if that doesn't seem to be the issue, maybe take another look at how googlebot is being directed via meta tags or x-robots headers on your pages - could some changes have been made unintentionally? hope this helps! let me know what you find :)
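quick way to audit that last part: a small helper (hypothetical, operating on a response you've already fetched however you like) that scans both the X-Robots-Tag header and any meta robots tag for a stray noindex:

```python
import re

def find_noindex_signals(headers, html):
    """Return the places a page tells bots not to index it.

    headers: dict of response headers; html: page source as a string.
    """
    signals = []
    # HTTP header names are case-insensitive, so compare lowercased
    for name, value in headers.items():
        if name.lower() == "x-robots-tag" and "noindex" in value.lower():
            signals.append(f"header: {value}")
    # naive meta tag scan; a real audit would use an HTML parser
    for m in re.finditer(r'<meta[^>]+name=["\']robots["\'][^>]*>', html, re.I):
        if "noindex" in m.group(0).lower():
            signals.append(f"meta: {m.group(0)}")
    return signals

page = '<html><head><meta name="robots" content="noindex,follow"></head></html>'
print(find_noindex_signals({"X-Robots-Tag": "noindex"}, page))
```

run it against the pages google isn't picking up - an unintentional noindex from a plugin or template change is one of the most common culprits for exactly this symptom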


