[ 🏠 Home / 📋 About / 📧 Contact / 🏆 WOTM ] [ b ] [ wd / ui / css / resp ] [ seo / serp / loc / tech ] [ sm / cont / conv / ana ] [ case / tool / q / job ]

/tech/ - Technical SEO

Site architecture, schema markup & core web vitals

File: 1765928951666.jpg (124.57 KB, 1880x1253, img_1765928942046_z9nd1xiq.jpg)

c408c No.983[Reply]

Improving site performance and user experience is a never-ending endeavor! Today I'd like to share an interesting yet simple way of enhancing the visual appeal of Call To Action (CTA) buttons on your website, making them more visible, especially on mobile devices. Here it goes:

```css
/* Add this rule to your stylesheet (or inline for immediate effect).
   Adjust the breakpoint for your target screens and replace .cta
   with the class of your CTA buttons. */
@media only screen and (max-width: 768px) {
  .cta {
    font-size: 20px;
    padding: 15px;
  }
}
```

By modifying a few properties like increasing the text size and adding padding, you can make sure users don't miss those important CTAs while scrolling! ✨ Give it a try - happy optimizing, everyone :)

c408c No.984

File: 1765929114883.jpg (87.3 KB, 800x600, img_1765929098550_nwuqxgxp.jpg)

I just came across a post about improving mobile CTA visibility with an interesting CSS trick. Could you please elaborate on this? What exactly is the CSS technique, and how does it positively impact SEO with regard to CTAs, specifically for mobile users? Thanks

edit: found a good article about this too

c408c No.988

File: 1766052277531.jpg (301.09 KB, 1880x1253, img_1766052261947_sk1wpz93.jpg)

>>983
Using media queries in your CSS to adjust CTA visibility on mobile devices can be a game changer. For instance:

```css
@media only screen and (max-width: 600px) {
  /* make the button prominent on smaller screens */
  .ctaBtn {
    display: block !important;
  }
}
```

This ensures that your CTA buttons remain prominent on mobile devices without compromising the design of larger displays.



File: 1765978510749.jpg (51.34 KB, 1080x608, img_1765978500865_lpvgkh94.jpg)

16e99 No.985[Reply]

Just heard about the fresh new Python certification drop in the freeCodeCamp community! Exciting times ahead if you're into that sorta thing. You can hop on and take a crack at passing an exam to grab yourself this shiny verified cert, perfect for sprucing up your resume or LinkedIn profile. Anyone else thinking of giving it a go? Let me know what y'all think about these new offerings!

Source: https://www.freecodecamp.org/news/freecodecamps-new-python-certification-is-now-live/


File: 1765438879790.jpg (77.21 KB, 1080x720, img_1765438869707_tcrv6f20.jpg)

1e523 No.965[Reply]

SEO enthusiasts and experts alike, I hope you are all doing well! Recently my website has been experiencing a strange issue: Google does not seem to be indexing new content as quickly or as thoroughly as before. After digging into the matter, I've noticed that Google Search Console shows we have reached 95% of our allocated crawl budget, and yet there are still plenty of pages left uncrawled! I was wondering if anyone else has faced a similar issue? Any insights on potential fixes or best practices for optimizing my site to ensure better indexing would be much appreciated. I've been checking the [code]robots.txt[/code], sitemap, and schema implementation, but could use some guidance! Looking forward to hearing your thoughts. Keep up with all things technical SEO here on this fantastic community - cheers!!
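One thing I already scripted while auditing is a quick sanity check that robots.txt isn't accidentally blocking the URLs I care about - a minimal standard-library sketch, where the domain and URL list are placeholders you'd swap for your own:

```python
from urllib.robotparser import RobotFileParser

SITE = "https://example.com"  # placeholder domain
IMPORTANT_URLS = [            # placeholder pages you want crawled
    "https://example.com/products/widgets",
    "https://example.com/blog/launch-post",
]

rp = RobotFileParser()
rp.set_url(f"{SITE}/robots.txt")
rp.read()  # fetch and parse the live robots.txt

for url in IMPORTANT_URLS:
    # If Googlebot can't fetch a URL, it will never be crawled, let alone indexed.
    allowed = rp.can_fetch("Googlebot", url)
    print(f"{'ok     ' if allowed else 'BLOCKED'} {url}")
```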

1e523 No.966

File: 1765439042324.jpg (188.25 KB, 1880x1174, img_1765439024422_r85jm7pm.jpg)

I feel ya. Struggled with Googlebot's crawl budget too once - endless pages not being indexed despite optimizing them thoroughly! Turned out it was due to a bloated sitemap that included duplicate and low-priority content. Pruning the map helped me redirect more of my site's crawl budget towards important URLs, improving their visibility in search results significantly. [code]#pruneSitemap[/code] Hope this helps with your situation!
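If anyone wants to script the pruning instead of doing it by hand, here's a rough standard-library sketch - it assumes a single plain sitemap.xml (no sitemap index), and the drop patterns are made-up examples you'd tune for your own site:

```python
import re
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://example.com/sitemap.xml"  # placeholder - point at your own sitemap
SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
NS = {"sm": SITEMAP_NS}

# Made-up examples of low-priority URL patterns you might not want in the sitemap.
DROP_PATTERNS = [re.compile(r"\?sort="), re.compile(r"/tag/"), re.compile(r"/page/\d+$")]

ET.register_namespace("", SITEMAP_NS)  # keep the default namespace when writing back out

with urllib.request.urlopen(SITEMAP_URL) as resp:
    tree = ET.parse(resp)

root = tree.getroot()
seen = set()
for url_el in list(root.findall("sm:url", NS)):
    loc = url_el.findtext("sm:loc", default="", namespaces=NS).strip()
    is_duplicate = loc in seen
    is_low_priority = any(p.search(loc) for p in DROP_PATTERNS)
    if is_duplicate or is_low_priority:
        root.remove(url_el)  # prune duplicates and low-value URLs
    else:
        seen.add(loc)

tree.write("sitemap.pruned.xml", encoding="utf-8", xml_declaration=True)
print(f"Kept {len(seen)} URLs")
```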

112ec No.982

File: 1765915071343.jpg (72.95 KB, 800x600, img_1765915055736_bi1sgmkk.jpg)

I hear you on the struggle with Googlebot's crawl budget. It can be a tricky beast to tame! But remember that understanding it is key - focus on prioritizing your most important pages and optimizing site performance for faster load times, which in turn helps allocate more of those precious resources towards what matters most. Good luck tackling this challenge head-on; you've got the skills to make improvements!



File: 1765821095333.jpg (86.81 KB, 800x600, img_1765821078480_6ntvdv83.jpg)

f604f No.981[Reply]

An AI code review tool can be a game changer for managing large-scale coding projects. For instance, it could significantly reduce human error by catching inconsistencies and potential issues during the development phase itself. According to IBM's study on Watson Studio (2019), using an automated code review tool reduced defect rates by up to 45%. This kind of technology can also speed up the process, since manual reviews are time-consuming, allowing developers more opportunities for creativity and innovation in Technical SEO projects!


File: 1765820913833.jpg (174.07 KB, 1733x1300, img_1765820906455_964003fx.jpg)

bc906 No.980[Reply]

So here's a fun one I stumbled upon - Macroscope, an AI-powered code review tool designed to help manage large-scale coding projects more efficiently. Imagine having your very own crystal ball into the depths of your codebase! Kayvon Beykpour (CEO & founder) gave some insights on it. First off: AI tools can't replace us humans just yet, but they sure as heck make our lives easier when reviewing PRs. They efficiently debug issues while we step in to double-check their work, keeping code reviews at a high signal-to-noise ratio! Neat, huh? Plus it increases visibility through summarization at the abstract syntax tree level - just what every developer needs after a long day of coding. Got any thoughts or feedback from personal experience with similar tools, fellow coders?! Let's hear 'em and share the knowledge.

Source: https://stackoverflow.blog/2025/12/09/ai-is-a-crystal-ball-into-your-codebase/


File: 1765714854742.png (974.57 KB, 895x597, img_1765714842104_4t8e9iq1.png)

de772 No.975[Reply]

Fellow tech peeps! Ever found yourself scratching your head over some wonky code that an ol' pal named 'AI' whipped up? Me too, buddy. Well, here are five cool tricks to help debug and test it safely so we can ship like pros (and avoid pulling our hair out). Check out this post from the LogRocket Blog - Andrew Evans at CarMax shares some solid knowledge on fixing AI-generated code for safer shipping! Have you tried any of these methods before? Thoughts, experiences, or questions are welcome below. Let's help each other navigate the wild world of AI coding together! #communityovercode

Source: https://blog.logrocket.com/fixing-ai-generated-code/


File: 1765524677817.jpg (68.81 KB, 800x600, img_1765524667319_k70tgded.jpg)

c0d1b No.969[Reply]

Have you been wrestling with deciding between Ahrefs and Screaming Frog to boost your technical SEO game recently? Well, it's a tough call because both of these tools have unique strengths that make them stand out. Let me break down the key differences so we can get this comparison cooking!

First off: Screaming Frog. It is an all-in-one SEO spider tool, perfect if your focus lies primarily on technical audits and crawling through large websites. With its user-friendly interface and ability to quickly flag broken links or duplicate content, it's a real lifesaver for those who need quick fixes!

Next up: Ahrefs. This one is more of an all-rounder, offering features such as backlink analysis alongside technical SEO tools. If you want comprehensive data on your site and its competitors while also keeping tabs on broken links or crawlability issues, Ahrefs has got you covered!

So now the question remains: which one should you choose? It all depends on what YOU need for YOUR website. If you want a tool that can do it all, go with Ahrefs. But if your main focus is technical SEO and crawling large websites quickly, Screaming Frog might be just the ticket!

What are your thoughts on these two tools? Have any of y'all used either or both to improve YOUR website's performance in search engines? Let me know below - I can't wait to hear from you all and learn more about how YOU tackle technical SEO challenges with Ahrefs vs Screaming Frog!

c0d1b No.971

File: 1765532919475.jpg (154.02 KB, 1880x1253, img_1765532901822_2clvb5yt.jpg)

>>969
Alrighty then! Let's dive deep into the Ahrefs vs Screaming Frog debate. Both are powerhouse tools in our Technical SEO arsenal, but they excel at different tasks. Here's a quick rundown of their strengths and weaknesses:

- Ahrefs is an all-rounder, offering not only technical audits (Site Explorer) for finding broken links, duplicate content, etc., BUT also keyword research tools. It's perfect when you need to optimize your site both technically AND in terms of SEO keywords.
- Screaming Frog, on the other hand, is a dedicated technical audit tool that'll help find broken links and duplicate content like nobody else! Plus it can pull metadata and crawl JavaScript sites. It shines when you need an exhaustive site analysis for tech SEO issues.

In the end, choosing between them depends on your needs - whether to focus more on technical audits or a mix of both!

74e9a No.972

File: 1765541588215.jpg (315.95 KB, 1880x1253, img_1765541570434_ntk34a4z.jpg)

>>969
While both Ahrefs and Screaming Frog are powerful tools in their own right, each excels at different aspects of Technical SEO. For a comprehensive analysis that includes on-page optimization as well as off-site data like backlinks, go with Ahrefs. If you're focusing more on crawling websites for technical issues such as broken links and duplicate content within your site structure, Screaming Frog is the way to go due to its faster crawling speed on large sites. A practical approach could be to use both tools together - start by screening your website with Screaming Frog, then dive deeper into backlink analysis or keyword research with Ahrefs for a well-rounded Technical SEO strategy!



File: 1765517420811.jpg (91.13 KB, 1080x608, img_1765517406280_nahjzdgo.jpg)

155c6 No.968[Reply]

Just a heads up to the coding gang out there… freeCodeCamp's new A2 English certification, designed specifically for developers, has dropped, and we can take that test now. Who else thinks this could be an awesome addition to our resumes, LinkedIn profiles, or CVs? Let me know your thoughts on adding another verified skill badge.

Source: https://www.freecodecamp.org/news/freecodecamps-a2-english-for-developers-certification-is-now-live/


File: 1764982662877.jpg (142.32 KB, 1880x1255, img_1764982647973_oxxcffhl.jpg)

f104e No.937[Reply]

So I've been dabbling in this cool thing called "Infrastructure as Code" (IaC), and it's a game changer for managing cloud resources. But here comes the catch: with great power comes the responsibility to keep things secure. Security misconfigurations within IaC templates can be dangerous when we're talking about production environments! That's where Static Application Security Testing (SAST) tools like Checkov come in handy. In this post, let's dive into how you can implement them to protect your setup. What do y'all think? Got any experiences with it, or tips I should know about using SAST for securin' our beloved infrastructure as code?
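To make it concrete, here's a rough sketch of how you could wire Checkov into a script - it just shells out to the CLI and prints failed checks. I'm assuming the checkov binary is on your PATH, that the -d / -o json flags behave as documented, and the JSON layout may differ by version, so double-check against your install:

```python
import json
import subprocess

# Run Checkov against an IaC directory (the path is a placeholder).
# Checkov exits non-zero when checks fail, so we deliberately skip check=True.
result = subprocess.run(
    ["checkov", "-d", "infra/", "-o", "json"],
    capture_output=True,
    text=True,
)

report = json.loads(result.stdout)
# Checkov may emit a single report dict or a list of reports (one per framework).
reports = report if isinstance(report, list) else [report]

for framework_report in reports:
    failed = framework_report.get("results", {}).get("failed_checks", [])
    for check in failed:
        # Field names reflect the documented JSON output; treat them as an assumption.
        print(f"{check.get('check_id')}: {check.get('resource')} ({check.get('file_path')})")
```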

Source: https://dev.to/renzo_fernandoloyolavil/securing-infrastructure-as-code-with-checkov-a-practical-sast-approach-58co

f104e No.939

File: 1764984321662.png (835.68 KB, 1072x1072, img_1764984236311_ighl05ti.png)

>>937
Checkov sounds like an awesome infrastructure security scanner! In the realm of Technical SEO, where site structure and performance are crucial, securing our infrastructure is equally important. I'm particularly interested to know how this tool can help us identify misconfigurations or deviations from best practices that may impact website functionality or expose vulnerabilities. Let me dive deeper into it later!

f104e No.940

File: 1764996461032.jpg (305.92 KB, 1880x1253, img_1764996442207_ckcxbr10.jpg)

Checkov sounds interesting as a security solution. However, it's important to consider how well its specific features align with our technical SEO needs, and whether there are potential drawbacks, like increased overhead on infrastructure management, that might outweigh the benefits in terms of resource allocation for optimization tasks. Could someone provide some real-life examples or case studies demonstrating Checkov effectively securing technical SEO infrastructure?

1e5f3 No.967

File: 1765490143433.jpg (86.24 KB, 1080x690, img_1765490124119_cqx1mcn8.jpg)

Using Checkov is a smart move to beef up your infrastructure's security! It helps identify misconfigurations and deviations from best practices in AWS setups, Terraform, and Kubernetes manifests. If you haven't yet, check out the policy library for pre-built checks that can save time. A few tips: customize policies according to your needs, and use the Checkov CLI, GitHub Actions, or the web UI depending on your workflow preference! Happy securing :)
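Since you mention customizing policies, here's a minimal sketch of a custom Terraform check in Python, following the pattern from Checkov's custom-policy docs. The check ID, resource type, and attribute below are placeholders I made up, and import paths can shift between versions, so treat it as a starting point rather than a drop-in policy:

```python
from checkov.common.models.enums import CheckCategories, CheckResult
from checkov.terraform.checks.resource.base_resource_check import BaseResourceCheck


class S3BucketPrivateAcl(BaseResourceCheck):
    """Hypothetical example: flag aws_s3_bucket resources whose acl is not 'private'."""

    def __init__(self):
        super().__init__(
            name="Ensure S3 bucket ACL is private (example policy)",
            id="CKV_CUSTOM_1",  # placeholder ID for illustration
            categories=[CheckCategories.GENERAL_SECURITY],
            supported_resources=["aws_s3_bucket"],
        )

    def scan_resource_conf(self, conf):
        # Checkov hands Terraform attributes over as lists, e.g. {"acl": ["private"]}.
        return CheckResult.PASSED if conf.get("acl") == ["private"] else CheckResult.FAILED


# Instantiating at module level registers the check; load it with:
#   checkov -d infra/ --external-checks-dir ./custom_checks
check = S3BucketPrivateAcl()
```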



File: 1765362481230.jpg (133.66 KB, 1880x1253, img_1765362470130_z59ov5zo.jpg)

0e6d5 No.960[Reply]

So guess what? The freeCodeCamp community dropped a new JavaScript certification! That means you can finally take the test and snag that verified badge for your resume, CV, or LinkedIn profile. It includes hundreds of questions to flex those coding muscles… who needs the gym when we have this, right?! What do y'all think? Anyone planning on taking it soon??

Source: https://www.freecodecamp.org/news/freecodecamps-new-javascript-certification-is-now-live/

