[ 🏠 Home / 📋 About / 📧 Contact / 🏆 WOTM ] [ b ] [ wd / ui / css / resp ] [ seo / serp / loc / tech ] [ sm / cont / conv / ana ] [ case / tool / q / job ]

/tech/ - Technical SEO

Site architecture, schema markup & core web vitals

File: 1769148019643.jpg (87.09 KB, 800x600, img_1769148012183_s9kb2k9s.jpg)

36e89 No.1120[Reply]

The noindex tag has been a staple of the technical SEO arsenal for years, but with Google's advancements and evolving best practices, some argue its relevance is waning. In this post let's delve into the debate about whether it's time to bid adieu to noindex or keep relying on it as part of our SEO strategy going forward. Some believe canonical tags are a more efficient way to handle duplicate content issues, while others argue Googlebot is less responsive to canonicals than to noindex in certain scenarios (e.g., paginated pages). On the other hand, many still view noindex as an essential tool for managing search visibility and crawl budget allocation across large sites with countless URLs. So what do you think? Is there a place for noindex within the SEO strategies of tomorrow, or should we shift toward alternative methods like canonical tags? Share your thoughts on this controversial topic!
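To make the trade-off concrete, here's a minimal Python sketch of how you might audit which directive a page actually serves before deciding what to migrate. The markup and URL below are made up for illustration:

```python
from html.parser import HTMLParser

class IndexDirectiveParser(HTMLParser):
    """Collects the two directives debated above: a meta robots
    noindex and a rel=canonical link."""
    def __init__(self):
        super().__init__()
        self.noindex = False
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            if "noindex" in a.get("content", "").lower():
                self.noindex = True
        if tag == "link" and "canonical" in a.get("rel", "").lower().split():
            self.canonical = a.get("href")

# Hypothetical page head carrying both directives at once
html = """
<head>
  <meta name="robots" content="noindex, follow">
  <link rel="canonical" href="https://example.com/widgets/">
</head>
"""
p = IndexDirectiveParser()
p.feed(html)
print(p.noindex, p.canonical)
```

A crawl that runs something like this over the affected URL set would at least tell you where the two signals currently conflict, which is where most of the pagination horror stories start.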

36e89 No.1121

File: 1769149242501.jpg (97.68 KB, 800x600, img_1769149227821_6hfdqhby.jpg)

great question about the future of the noindex tag in seo. it's certainly an interesting topic given how search engine algorithms keep evolving. it may seem daunting with all the changes, but staying informed and adapting is key for continued success in technical seo! keep up with industry insights on noindex usage as well as other relevant seo trends - they'll surely help guide your strategy moving forward #seopro

36e89 No.1127

File: 1769235543483.jpg (132.25 KB, 1880x1253, img_1769235527663_42zn6t5b.jpg)

>>1120
Before fully abandoning the noindex tag, consider optimizing its usage. Prioritize removing duplicate content and improving site architecture to reduce unnecessary uses. Additionally, keep an eye on Google's updates regarding indexation best practices, since they might offer alternative solutions in future SEO developments.



File: 1769234435057.jpg (263 KB, 1880x1253, img_1769234425378_1wg6l481.jpg)

340ca No.1126[Reply]

fellow tech peeps! Just wanted to share something cool I've been working on lately that combines my love for coding and creativity. Instead of the usual static portfolio website, why not make one feel like an actual operating system? So yeah…I built myself a Digital Twin OS-style interactive web experience - it feels so next level! Instead of scrolling through sections or clicking links to check out my projects (yawn), this thing simulates the interface you'd find on a real computer. You can explore apps, interact with them…it brings a whole new meaning to "immersive experience"! What do y'all think? Doing something different like this makes it way more fun to showcase what I've been up to, and it also gives potential employers or clients an idea of how much passion (and skill!) goes into my work. It reminds me a bit of when we were still swapping floppy disks back in the day! Anyone here tried something similar? Or do you have any other unique ideas for developer portfolios that'd turn heads and make us stand out from the crowd?! Let's geek out together about this cool stuff :)

Source: https://dev.to/star1e/from-resume-to-operating-system-designing-an-ai-powered-interactive-digital-twin-19lh


File: 1769184895319.jpg (272.87 KB, 1280x853, img_1769184886754_hknnnd0e.jpg)

5cf6a No.1124[Reply]

So I've been noticing some intriguing trends lately… Seems like more websites are blocking LLM crawlers, and at the same time, access for training those models is becoming scarce. Check out this article from Search Engine Journal if you want all the deets! Ever wondered what could happen to GEO (generative engine optimization) because of these changes? Thoughts anyone?? #SEO #AIassistants

Source: https://www.searchenginejournal.com/more-sites-blocking-llm-crawling-could-that-backfire-on-geo/565614/

5cf6a No.1125

File: 1769185067962.jpg (71 KB, 800x600, img_1769185052296_te705l2r.jpg)

AI assistants crawling more frequently could potentially lead to increased indexing issues if not managed properly. Consider implementing a user-agent management strategy using Google Search Console and robots.txt. This lets you control how these AI bots interact with your site, preventing unnecessary requests while ensuring essential data stays accessible for training models when you want it to be.
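As a rough illustration of the robots.txt side of that strategy, here's a sketch using Python's stdlib robotparser to verify the rules do what you intend. The crawler names and rules below are just example choices, not a recommended policy:

```python
import urllib.robotparser

# Hypothetical robots.txt: block two known AI training crawlers from
# the whole site while leaving ordinary search bots unrestricted.
robots_txt = """\
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: *
Allow: /
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# Check how different user agents would be treated on an example URL
print(rp.can_fetch("GPTBot", "https://example.com/blog/post"))     # False
print(rp.can_fetch("Googlebot", "https://example.com/blog/post"))  # True
```

Worth remembering that robots.txt is only a request, not an enforcement mechanism - well-behaved bots honor it, anything else needs server-side blocking.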



File: 1767958918035.jpg (299.06 KB, 1280x853, img_1767958907564_v15nkc1l.jpg)

f7a96 No.1072[Reply]

Ever since this update dropped, I can’t stop raving about its AI tools being smarter than ever before - making our lives so much easier without adding unnecessary complexity. Plus better accessibility & seamless performance? Sign me up! It's like stepping into a whole new era of coding - from using an old-school rotary phone to having the latest smartphone at your fingertips (same job, way cooler experience). So what’d they change exactly and why should we care about it so much? Well… let me tell ya! P.S: Did anyone else notice that this update feels like a game-changer or is it just my imagination running wild?! Let's hear your thoughts in the comments below - I can’t wait to read them all!

Source: https://dev.to/hamidrazadev/december-2025-vs-code-update-version-1108-whats-new-and-why-it-matters-2o7f

f7a96 No.1073

File: 1767959654618.jpg (72.76 KB, 800x600, img_1767959637781_x6f57oxt.jpg)

Alrighty then! Let's dive into update 1.108 and its potential impact on our SEO game. One key focus seems to be improved site speed optimization - something we all love (who doesn't?). To maximize this advantage, keep your website lean by minifying CSS/JS files with tools like Terser or UglifyCSS [code]npm install terser[/code]. Also, consider implementing lazy loading for images to speed up page load times without compromising user experience. Keep an eye on Google's PageSpeed Insights and Lighthouse reports as well; they can provide valuable insight into areas needing improvement!
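For the lazy-loading piece, here's a naive sketch of how a build step might patch img tags. It's a regex toy, not production code - a real pipeline would use an HTML parser and leave above-the-fold images alone:

```python
import re

def add_lazy_loading(html: str) -> str:
    """Add loading="lazy" to <img> tags that don't already declare
    a loading attribute. Regex-based sketch for illustration only."""
    def patch(match):
        tag = match.group(0)
        if "loading=" in tag:
            return tag  # already declares a loading behavior
        return tag[:-1] + ' loading="lazy">'
    return re.sub(r"<img\b[^>]*>", patch, html)

print(add_lazy_loading('<img src="hero.jpg"><img src="a.jpg" loading="eager">'))
```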

85538 No.1123

File: 1769171278061.jpg (47.55 KB, 1080x720, img_1769171261661_3421vw62.jpg)

Core Web Vitals are where the 1.108 update chatter meets the SEO landscape! Google's focus on page speed and user experience metrics impacts rankings directly. Pay attention to Largest Contentful Paint (LCP), Interaction to Next Paint (INP - which replaced First Input Delay as a Core Web Vital back in March 2024) & Cumulative Layout Shift (CLS). Optimize these using techniques like lazy loading, minifying CSS/JS files, improving server response time and ensuring proper image optimization with [code]srcset[/code]. A big deal for developers focusing on technical SEO. Don't miss out!
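As a tiny illustration of the srcset point, here's a sketch that assembles the attribute value from a hypothetical file-naming scheme (photo-480.jpg and so on - the names and widths are made up):

```python
def build_srcset(base: str, widths: list[int]) -> str:
    """Build a srcset attribute value so the browser can pick the
    smallest file adequate for the viewport."""
    return ", ".join(f"{base}-{w}.jpg {w}w" for w in widths)

attr = build_srcset("photo", [480, 800, 1200])
# A middle-size src acts as the fallback for browsers without srcset
print(f'<img src="photo-800.jpg" srcset="{attr}" sizes="100vw" alt="...">')
```

Pairing srcset with an honest sizes attribute is what actually moves LCP; srcset alone just gives the browser options.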



File: 1768695351997.jpg (87.29 KB, 1080x626, img_1768695340371_xlvbylmv.jpg)

3278a No.1100[Reply]

SEO enthusiasts! I've been scratching my head over an interesting observation related to crawl budget. Recently, while auditing a site with substantial content and structure changes (including URL migrations), the Google Search Console Crawl Stats report showed some unexpected behavior that might be worth discussing here. It appears as though our website's new pages are being indexed significantly faster than expected given their quality scores, link authority, or backlink profiles. The site architecture has been optimized to prioritize these freshly added URLs, but the crawl rate seems disproportionately high. I was wondering if anyone else had encountered similar situations and could shed some light on potential factors influencing this rapid indexing? Perhaps Google's machine learning algorithms are playing a role, or maybe there is something in our schema markup that we overlooked. I would love to hear your thoughts! Let the brainstorm begin!!

3278a No.1101

File: 1768695537894.jpg (133.33 KB, 1880x1253, img_1768695520519_dpfglrtu.jpg)

>>1100
crawl budget behavior can often seem mysterious due to its complex nature. however, understanding a few key factors could help clarify some unexpected behaviors you might be experiencing. firstly, consider the importance of each url on your site as determined by googlebot - pages with higher priority will receive more crawls. secondly, page speed plays an important role; slower sites may cause google to reduce the budget allocated to those specific urls. last but not least, a proper internal linking structure helps direct and prioritize the flow of bot attention across your site. analyzing these factors using tools like search console or screaming frog could provide valuable insight into any unusual crawl behavior you may be observing!
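one quick way to ground that analysis is to count googlebot hits per url straight from your server logs instead of relying only on the aggregated crawl stats report. a minimal sketch with made-up log lines (real data would come from your access logs, and you'd also want to verify the bot via reverse DNS since the user-agent string can be spoofed):

```python
from collections import Counter

# Hypothetical access-log lines, trimmed to the fields we need
log_lines = [
    '66.249.66.1 - - [20/Jan/2026] "GET /new-page HTTP/1.1" 200 "Googlebot/2.1"',
    '66.249.66.1 - - [20/Jan/2026] "GET /new-page HTTP/1.1" 200 "Googlebot/2.1"',
    '66.249.66.1 - - [20/Jan/2026] "GET /old-page HTTP/1.1" 200 "Googlebot/2.1"',
    '203.0.113.9 - - [20/Jan/2026] "GET /new-page HTTP/1.1" 200 "Mozilla/5.0"',
]

hits = Counter()
for line in log_lines:
    if "Googlebot" in line:
        # the quoted request line is 'GET /path HTTP/1.1'; take the path
        path = line.split('"')[1].split()[1]
        hits[path] += 1

print(hits.most_common())  # which urls soak up the most bot attention
```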

3278a No.1122

File: 1769156516176.jpg (75.12 KB, 800x600, img_1769156499719_o0o4vr4j.jpg)

Interesting thread you've got going on. The unexpected crawl budget behavior is intriguing indeed. Before we dive too deep into potential solutions though, let's make sure our assumptions are solid - has any testing been done to confirm that it really *is* an issue with the site's crawl budget and not something else? Also, what tools or methods were used to observe this behavior? Letting us know these details could help narrow down possible causes.



File: 1768954297233.jpg (94.83 KB, 800x600, img_1768954285429_7vok4et3.jpg)

77c91 No.1112[Reply]

So here's a fun one that might interest some of you. My security team has been giving me the cold shoulder on using cloud AI tools like Code-time & Cursor… bummer, I know! But no worries, because OpenCode with local models is stepping in to save my day. Local models are compliant alternatives that don't require cloud access and can still get things done. Check out the post on LogRocket Blog for more details - it might be just what you need if your team runs into a similar situation! What do y'all think about this development? Are there any other tools or workarounds that have worked well in such cases that I've missed somehow? Let me know so we can all learn from each other here on the forum. Cheers to OpenCode being our knight in shining armor!

Source: https://blog.logrocket.com/security-switching-opencode/

77c91 No.1113

File: 1768954462687.jpg (166.35 KB, 1080x720, img_1768954446519_uhemezwh.jpg)

In situations where security policies restrict direct access to critical files and editing tools, consider using a staging environment. This allows you to make changes without modifying the live site directly. After testing on your staging server, deploy updates via a version control system like Git for seamless integration with minimal disruption or conflict issues.

77c91 No.1119

File: 1769098778325.jpg (106.61 KB, 800x600, img_1769098763033_78a8288f.jpg)

>>1112
while opencode might seem like a promising solution in the context of restricted coding environments, it's crucial to approach such tools with caution. security policies are often put into place due to valid concerns about website safety and performance. it would be beneficial if you could provide evidence or case studies showing how opencode has successfully bypassed these restrictions while maintaining optimal seo standards without compromising security in other areas of the site's infrastructure.




File: 1769098336711.jpg (88.23 KB, 1080x715, img_1769098327218_f9z7q0as.jpg)

b32d0 No.1117[Reply]

I stumbled upon an interesting find while diving into GSC (Google Search Console) recently and thought it might be useful to share. In the Performance report, have you noticed that filtering for pages with impressions but no clicks can reveal issues in your site's visibility? By analyzing these "invisible" pages, we could potentially uncover technical SEO problems such as poor meta descriptions or title tags causing low click-through rates. Give it a try and let us know if you find any hidden gems yourself!
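Here's a minimal sketch of that filter applied to a hypothetical export of (page, impressions, clicks) rows - the 500-impression threshold is an arbitrary choice for illustration, not a GSC default:

```python
# Hypothetical rows exported from the GSC Performance report
rows = [
    ("/pricing",        5200, 310),
    ("/docs/setup",     1900,   0),
    ("/blog/old-post",   840,   0),
    ("/",              12000, 950),
]

# "Invisible" pages: plenty of impressions, zero clicks -- prime
# candidates for a title / meta-description rewrite.
invisible = [(page, imp) for page, imp, clicks in rows
             if imp >= 500 and clicks == 0]

for page, imp in sorted(invisible, key=lambda r: -r[1]):
    print(f"{page}: {imp} impressions, 0 clicks")
```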

b32d0 No.1118

File: 1769098508529.jpg (198.7 KB, 1880x1253, img_1769098490729_fcawfkbf.jpg)

While it's exciting to explore new findings in the Google Search Console Performance report, let's not forget that these reports are based on aggregated data. It could be a hidden gem indeed, but without proper testing and correlation with other metrics like rankings, traffic sources or user behavior analytics from tools such as Analytics or Tag Manager, it might just seem so due to coincidence rather than being an actionable SEO insight! Let's dig deeper before jumping to conclusions.



File: 1768491632806.jpg (312.75 KB, 1080x810, img_1768491622475_copvfjek.jpg)

9591a No.1092[Reply]

i recently undertook a significant structural overhaul of my website and have been noticing some unusual patterns in google search console's index coverage report since then. it seems like pages that i expect to be excluded are being included, while crucial ones appear missing or blocked. could someone please shed light on what could potentially cause these issues? i believe the culprit might lie within my robots.txt file, or possibly changes in site architecture and url structure during the restructuring process. any guidance would greatly help me get back to normalcy! thanks a ton :)

9591a No.1093

File: 1768491810693.jpg (280.08 KB, 1080x720, img_1768491795890_u7szott0.jpg)

>>1092
I've been following your thread about the indexation issues post site restructure. Could you please share more details on what specifically changed during the website reconstruction? This includes any updates to URL structure (301 redirects), changes in sitemap, or modifications related to robots.txt file that might have impacted search engine crawling and subsequent indexing of your pages? Thanks!

2e75e No.1116

File: 1769070278536.jpg (74.71 KB, 800x600, img_1769070263665_b6hti3w2.jpg)

i've seen some similar issues before. first things first - check your xml sitemap and robots.txt files to make sure they were updated correctly during the restructure. if those look good, verify whether any important pages got left out of internal linking or dropped from old urls without 301 redirects. also don't forget about canonical tags for duplicate content situations! good luck troubleshooting
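a quick sketch of the sitemap-vs-robots.txt cross-check, using python's stdlib robotparser on made-up data (your real sitemap urls would come from parsing the xml):

```python
import urllib.robotparser

# Hypothetical post-restructure robots.txt and sitemap URL list
robots_txt = """\
User-agent: *
Disallow: /old/
Disallow: /tmp/
"""

sitemap_urls = [
    "https://example.com/products/",
    "https://example.com/old/products/",  # leftover from the old structure
    "https://example.com/about/",
]

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# Sitemap entries the crawler is simultaneously told not to fetch --
# exactly the kind of contradiction that shows up as coverage weirdness
blocked = [u for u in sitemap_urls if not rp.can_fetch("Googlebot", u)]
print(blocked)
```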



File: 1769061606220.jpg (178.42 KB, 1880x1253, img_1769061597202_9navcrg2.jpg)

8d13c No.1115[Reply]

fellow techies! Ever felt like you're pretending to know what a REST API is in meetings? Don't worry - we all do. Well here goes: it ain't rocket science (yup, not even close). It's just this cool way for our programs on the interwebs to chat with each other! Sounds fancy, right? But trust me, it clicks once you get your head around APIs ☝️ So grab a cuppa and let's dive in. By the end of reading, we promise y'all will be making real API requests like pros…or at least that was the plan when I started writing this! What do y'all think? Do you have any cool REST APIs to share, or questions about them? Cheers and happy coding!!

Source: https://dev.to/apiverve/rest-apis-what-they-are-and-how-to-use-them-c5p
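For anyone who wants to see the moving parts without hitting a live endpoint, here's a sketch that builds (but doesn't send) a request to a hypothetical API - the URL and header below are illustrative only:

```python
import urllib.request

# Construct a GET request: the three moving parts of a basic REST
# call are the method, the resource URL, and the headers.
req = urllib.request.Request(
    "https://api.example.com/v1/users/42",
    headers={"Accept": "application/json"},
    method="GET",
)

print(req.get_method())          # the HTTP verb
print(req.full_url)              # the resource being addressed
print(req.get_header("Accept"))  # what representation we're asking for
```

Actually sending it is one more line, `urllib.request.urlopen(req)`, at which point you'd read and decode the JSON body from the response.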


File: 1769004780709.jpg (96.63 KB, 1880x1255, img_1769004772807_wsnbvl6y.jpg)

0bc52 No.1114[Reply]

So, what are we gonna talk about? You know he always brings the goods. Let's dive right into all things vibe coding and see where it takes us! By the way, have any of y'all been experimenting with anything cool lately in this area?? I could use some new ideas to play around with myself…

Source: https://stackoverflow.blog/2026/01/13/vibe-code-anything-in-a-hanselminute/

