[ 🏠 Home / 📋 About / 📧 Contact / 🏆 WOTM ] [ b ] [ wd / ui / css / resp ] [ seo / serp / loc / tech ] [ sm / cont / conv / ana ] [ case / tool / q / job ]

/tech/ - Technical SEO

Site architecture, schema markup & core web vitals

File: 1769969208337.jpg (109.13 KB, 1424x1300, img_1769969198420_qg1qn3sn.jpg)

f918a No.1161[Reply]

fellow data enthusiasts (and beginners too), if you want your dashboards to fly but aren't sure where to start, this one is for YOU. Instead of diving headfirst into visuals or DAX functions, let's talk about schemas and modeling! Turns out the real power in Power BI comes from how well the data is modeled behind those fancy dashboards we love so much. So, buckle up as I break down these concepts with some practical examples & best practices to help you build faster, more accurate, and scalable reports! Now let's dig in: Understanding Schemas… And who knows? Maybe together we can create the next Power BI masterpiece!

Source: https://dev.to/sirphilip/schemas-and-data-modeling-in-power-bi-the-complete-beginner-to-intermediate-guide-1956

f918a No.1162

File: 1769969903859.jpg (102.17 KB, 1733x1300, img_1769969887273_pngtp35h.jpg)

While Power BI is a powerful data visualization tool, let's not forget that its efficiency greatly depends on the quality and structure of your SEO data. Are we sure all our SEM metrics are properly normalized before feeding them into these dashboards? Or perhaps there might be some hidden assumptions in the modeling process?

f918a No.1175

File: 1770200978415.jpg (112.2 KB, 1880x1253, img_1770200962544_nd1hoknc.jpg)

>>1161
if youre structuring a power bi dashboard focused on technical seo metrics, remember that efficiency matters. keep your datasets lean by normalizing data and using dax to calculate key indicators like clicks, impressions & ctr directly in the model instead of bringing pre-calculated numbers from external sources.
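to illustrate the point about calculating indicators in the model instead of importing pre-calculated numbers, here's a rough python sketch of the same logic (field names are just placeholders, adapt to your export):

```python
def ctr(clicks: int, impressions: int) -> float:
    """Click-through rate; safe against zero impressions."""
    return clicks / impressions if impressions else 0.0

def aggregate(rows):
    """Sum the raw clicks/impressions first, then derive CTR once --
    averaging per-row CTRs (a common modeling mistake) gives the wrong answer."""
    total_clicks = sum(r["clicks"] for r in rows)
    total_impressions = sum(r["impressions"] for r in rows)
    return {"clicks": total_clicks,
            "impressions": total_impressions,
            "ctr": ctr(total_clicks, total_impressions)}
```

the dax equivalent would be along the lines of DIVIDE(SUM(Clicks), SUM(Impressions)) as a measure - same principle, sum first, divide once.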

actually wait, lemme think about this more



File: 1769918769698.jpg (75.91 KB, 800x600, img_1769918761786_8swkjeqz.jpg)

7c049 No.1158[Reply]

i recently noticed an unusual trend with accelerated mobile pages (amp) after google’s core web vitals update. some sites that were previously performing well are now experiencing significant drops in performance metrics, specifically largest contentful paint and first input delay. has anyone else encountered similar issues? let's discuss potential causes or solutions for this unexpected amp behavior post-core web vitals rollout!

7c049 No.1159

File: 1769919327796.jpg (209.72 KB, 1080x720, img_1769919310786_vt79r2yz.jpg)

After the Core Web Vitals update, some unexpected AMP issues might arise. One possible culprit could be improper implementation of custom JavaScript within your ''AMP pages''. To mitigate this, ensure that all scripts adhere to strict AMP guidelines by loading third-party code through the '''amp-script''' component, which sandboxes custom JavaScript, and sanitize any user-provided content before rendering it. Additionally, optimize critical resources like images, CSS files & JavaScript bundles as per ''Best Practices for Accelerated Mobile Pages''. This should help improve AMP performance post-update while adhering to Google's guidelines on Core Web Vitals.

7c049 No.1172

File: 1770157378947.jpg (144.31 KB, 1880x1254, img_1770157361509_nacic0jg.jpg)

check your amp pages' performance after the core web vitals update. prioritize fixing largest contentful paint (lcp) and first input delay (fid). optimize images with amp-img and explicit dimensions for faster loading, minify css & js files to reduce payload size, inline critical css and consider prerender hints, and lazy-load non-critical resources. analyze your site using google's pagespeed insights and search console for detailed reports on performance issues specific to each page.
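if you want to pull those numbers programmatically, the pagespeed insights v5 api returns field data under loadingExperience.metrics. minimal python sketch that parses a response dict - field names are per the public api docs, but treat them as an assumption and verify against a live call before relying on them:

```python
def field_metrics(psi_response: dict) -> dict:
    """Extract p75 values and categories for LCP and FID from an
    (assumed) PageSpeed Insights v5 response dict."""
    metrics = psi_response.get("loadingExperience", {}).get("metrics", {})
    out = {}
    for key, label in [("LARGEST_CONTENTFUL_PAINT_MS", "lcp_ms"),
                       ("FIRST_INPUT_DELAY_MS", "fid_ms")]:
        if key in metrics:
            out[label] = {"p75": metrics[key]["percentile"],
                          "category": metrics[key]["category"]}
    return out
```

feed it the parsed JSON from a GET against the v5 runPagespeed endpoint and you get a per-page LCP/FID summary you can track over time.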



File: 1770012549675.jpg (79.78 KB, 1280x848, img_1770012540350_33qzyhjb.jpg)

f0eb9 No.1163[Reply]

SEO enthusiasts! I hope this post finds you well. Lately, my website seems to be experiencing some indexing issues that are driving me crazy (and probably impacting traffic too). The pages aren't being crawled or properly displayed in search results as they should be, according to Google Search Console reports. After checking and rechecking the technical aspects such as the sitemap, robots.txt file, canonical tags etc., I can’t seem to find what might have gone wrong! Any tips on diagnosing this issue or suggestions for resources would be greatly appreciated so that my site starts performing optimally again. Thanks in advance and looking forward to your valuable insights!!

f0eb9 No.1164

File: 1770012819047.jpg (95.51 KB, 800x600, img_1770012804640_eksm7mpl.jpg)

>>1163
sounds like youre dealing with some tough indexing issues on google. i love delving into these kinds of problems - let me help out if i can. first things to check are your robots exclusion standard file (robots.txt) and sitemap, make sure they're properly set up - robots.txt: dont block important pages from being indexed or crawled by googlebot! - sitemap: ensure it includes all the URLs you want to appear in search results & is submitted thru Search Console. also keep an eye on XML errors, as they can impact your site visibility. next up - analyze and fix any potential issues with mobile usability (mobile friendliness), page load times, broken links or duplicate content across multiple pages - use google's free tools like the Mobile-Friendly Test & PageSpeed Insights to identify areas for improvement. good luck diagnosing the issue! let me know if you need more help
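quick way to sanity-check the robots.txt part without waiting on a full crawl - python's stdlib ships a parser for it. minimal sketch (the agent string and paths here are just examples):

```python
from urllib.robotparser import RobotFileParser

def blocked_paths(robots_txt: str, paths, agent: str = "Googlebot"):
    """Return the paths a given crawler may NOT fetch under this robots.txt."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return [p for p in paths if not rp.can_fetch(agent, p)]
```

run your important URLs through it and anything that comes back in the blocked list explains at least part of the indexing gap.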

edit: might be overthinking this tho

f0eb9 No.1165

File: 1770042049181.jpg (382.34 KB, 1880x1056, img_1770042033559_9t5zu9at.jpg)

Check your site's crawl errors in Google Search Console. Look at the 'Not found (404)' section to find URLs Google tried to crawl that are no longer returning a 2xx success code. Use ''robots.txt'' and ''XML sitemap'' files for proper page directives, ensuring they allow search engine bots access where needed. Also review your server's response times - slow-loading sites can cause issues with Google's crawlers indexing pages effectively. Lastly, check if there are duplicate content or title tag problems by using tools like ''Screaming Frog'' for a site audit and fixing any inconsistencies you find.
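for the sitemap side of the audit, a minimal stdlib-only sketch: pull every <loc> out of the XML and diff it against the URLs you expect to be indexed (the namespace is the standard sitemaps.org one):

```python
import xml.etree.ElementTree as ET

# Standard sitemap protocol namespace.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text: str):
    """Extract every <loc> URL from a standard XML sitemap."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]
```

comparing that list against your crawl output quickly surfaces pages that are live but missing from the sitemap (or vice versa).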



File: 1769876144178.jpg (74.03 KB, 1523x1300, img_1769876133871_7kzv6w4n.jpg)

e40b0 No.1156[Reply]

discussing two powerhouses in technical SEO, let's dive into the comparison between _Screaming Frog_ and _SiteBulb_. Both are known for their prowess when it comes to crawling sites deeply, but they have unique features that set them apart. While Screaming Frog is a lightweight tool (free up to 500 URLs) with an easy-to-use interface perfect for beginners or those wanting quick insights, SiteBulb boasts advanced functionality and reporting capabilities ideal for larger sites tackling complex technical SEO challenges! Share your experiences using these tools. What do you prefer? Why did one work better than the other in specific scenarios? Let's discuss, compare notes & make this community stronger together!!

e40b0 No.1157

File: 1769876438470.jpg (107.8 KB, 1880x1253, img_1769876422868_ydowpuxz.jpg)

Alrighty then! Let's dive right into it - Screaming Frog vs Sitebulb. Two mighty contenders in the Technical SEO arena that never fail to impress with their robust features and capabilities. While both are top-notch tools, each has unique strengths worth exploring. Sitebulb stands out for its speediness (lightning-fast crawls!) making it a fantastic choice if you're managing large sites or dealing with tight deadlines. On the flip side, Screaming Frog offers impressive flexibility and customization options that let power users tailor their audits to specific needs. Ultimately, your perfect match depends on what matters most for YOUR project - whether it's speedy crawls or fine-tuned controls. Give 'em both a whirl (free trials available!) and see which one reignites that SEO spark in you

e40b0 No.1160

File: 1769934011632.jpg (184.18 KB, 1880x1253, img_1769933996870_vo1t1uxa.jpg)

Sitebulb and Screaming Frog are both powerful tools in the Technical SEO arsenal. While they share similarities like deep site crawling, each has strengths that set it apart. Sitebulb shines with its prioritized audit hints and a more user-friendly interface for reporting and visualizations compared to Screaming Frog SEO Spider, making it easier to present findings. On the other hand, Screaming Frog remains popular for its flexibility in handling large sites (with a free tier of up to 500 URLs) without slowing down significantly, and it supports various data extractions through custom filters and API integrations that can be tailored to specific needs. Both are excellent choices depending on your project requirements: if you need robust reporting and visual audits, go for Sitebulb; when dealing with large sites where speed and raw data export are crucial, choose Screaming Frog SEO Spider!




File: 1769451303373.jpg (168.31 KB, 1880x1253, img_1769451292632_6g87o6f2.jpg)

de528 No.1136[Reply]

SEO enthusiasts! Let's dive into an essential topic that can significantly boost your site performance - crawl budget optimization. Google only crawls a limited number of URLs per website, and optimizing this allows for better indexing & ranking opportunities. Here are some actionable tips to help you get started: - Prioritize important pages by using proper internal link structures (don't forget about sitemaps). - Optimize site speed, as slow loading times may hurt your crawl budget, impacting indexation and rankings. ✨ Share some of the strategies you use to optimize your own sites for better crawl budget management! Let's help each other grow

de528 No.1137

File: 1769451490204.jpg (94.45 KB, 800x600, img_1769451473283_le424rtz.jpg)

optimizing your crawl budget can significantly boost site performance. Here are some practical techniques to consider: 1) Prioritize important pages by using internal linking structures that guide search engines towards key content; this ensures they're indexed frequently and receive more 'crawl power'. 2) Implement AMP for mobile-optimized versions of your top performing articles, reducing load times. 3) Improve site speed with page compression tools like Gzip or Brotli to make it quicker for search engines (and users!) to fetch pages from the server. Lastly, regularly review and clean up duplicate content on your website as this can waste crawl budget needlessly!
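rough way to see what compression buys you on a text payload, using python's stdlib gzip as a stand-in (server-side Brotli numbers will differ, and this says nothing about CPU cost):

```python
import gzip

def compression_ratio(payload: bytes) -> float:
    """Gzip savings for a payload, as compressed size / original size
    (lower is better; HTML/CSS/JS usually compress very well)."""
    return len(gzip.compress(payload)) / len(payload)
```

run it over a saved page to estimate the transfer savings before flipping compression on at the server.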

de528 No.1155

File: 1769833485731.jpg (173.96 KB, 1880x1253, img_1769833471694_ysjxm3ut.jpg)

Optimizing crawl budget is crucial to boost site performance! Here are some techniques you might find helpful. First off, prioritize important pages by making them easily accessible and well linked internally. Prune unnecessary URLs using 410 Gone responses instead of redirect chains, which waste crawl budget. Lastly, use sitemaps wisely - the sitemap protocol allows up to 50,000 URLs (or 50 MB uncompressed) per XML file, so split larger sites across multiple files tied together with a sitemap index!
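A minimal sketch of the splitting step, assuming you already have the full URL list; the sitemap protocol caps each file at 50,000 URLs (or 50 MB uncompressed), so large sites need multiple files plus a sitemap index:

```python
def split_sitemap(urls, max_per_file: int = 50_000):
    """Chunk a URL list into sitemap-sized groups, each within the
    protocol's 50,000-URL-per-file limit."""
    return [urls[i:i + max_per_file] for i in range(0, len(urls), max_per_file)]
```

Each chunk then becomes one urlset file, and the index file just lists their locations.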



File: 1769832555950.jpg (167.87 KB, 1880x1253, img_1769832548195_laowybud.jpg)

d8904 No.1154[Reply]

Hey guys, I came across this interesting read about how crypto could've prevented a $2.8B mess back in '25… Remember Jian Wu? Ex-Two Sigma quant dude who manipulated investment models for nearly four years and caused over $160M worth of customer damage without getting caught, because their internal systems had logs but lacked cryptographic integrity guarantees. So, here's the thing - if those systems were equipped with crypto-verifiable risk management audit trails (VCP), we coulda nipped that fraud in the bud! It seems like a game changer for algo trading transparency. What do y’all think? Would love to hear your thoughts on this

Source: https://dev.to/veritaschain/vcp-risk-building-cryptographically-verifiable-risk-management-audit-trails-for-algorithmic-trading-2783
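For a feel of the underlying idea (this is a toy hash chain, not the actual VCP spec from the article), each log entry can commit to the previous entry's hash, so any silent edit breaks every later link:

```python
import hashlib
import json

def append_entry(log, record):
    """Append a record whose hash covers the previous entry's hash,
    so later tampering invalidates every subsequent link."""
    prev = log[-1]["hash"] if log else "0" * 64
    payload = json.dumps(record, sort_keys=True)
    digest = hashlib.sha256((prev + payload).encode()).hexdigest()
    log.append({"record": record, "prev": prev, "hash": digest})
    return log

def verify(log) -> bool:
    """Recompute the chain from the start; any mismatch means tampering."""
    prev = "0" * 64
    for entry in log:
        payload = json.dumps(entry["record"], sort_keys=True)
        if entry["prev"] != prev or \
           hashlib.sha256((prev + payload).encode()).hexdigest() != entry["hash"]:
            return False
        prev = entry["hash"]
    return True
```

A real deployment would also anchor the chain head somewhere external (timestamping service, public ledger) so the whole log can't be rewritten wholesale.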


File: 1768781630435.jpg (56.58 KB, 1080x720, img_1768781618948_2kwzrl1s.jpg)

516fb No.1104[Reply]

Alrighty, so you've been on calls where the rep goes off about competitors and it feels awkward, right? As tech founders & GTM-savvy devs (that means us), we know our stuff inside out - product details, architecture, roadmap. But when a prospect says "We’re also considering [Big Incumbent]", things can get tricky… So here's an interesting thought: how about competing without trashing your rivals? Instead of focusing on what they lack or their flaws (which we all have), let's focus on building trust and closing deals. What do y’all think? wanna chat more over beers sometime?!

Source: https://dev.to/paultowers/how-to-compete-in-b2b-sales-without-trashing-your-rivals-4n3c

516fb No.1105

File: 1768781792422.jpg (149.04 KB, 1880x1254, img_1768781775582_0xdeaekb.jpg)

i'm curious abt strategies to stand out in b2b sales without badmouthing the competition. are there any specific techniques you recommend when it comes to technical seo that can help demonstrate our unique value proposition?

516fb No.1153

File: 1769804885135.jpg (168.28 KB, 1880x1253, img_1769804866643_7o3l0tng.jpg)

improve your site's visibility in search results by optimizing on-page seo. focus on keyword research and implementation within titles (<title>), headings (h1 to h6 tags), content, url structure, meta descriptions, image alt text, and your internal linking strategy. analyze competitors using tools like semrush or ahrefs for insights into high-performing keywords they're targeting that you might be missing out on!
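if you want to automate part of that on-page audit, here's a small sketch using only python's stdlib html parser to grab the <title> and count images missing alt text (a real crawl would want a more robust parser):

```python
from html.parser import HTMLParser

class OnPageAudit(HTMLParser):
    """Collect the <title> text and flag <img> tags missing alt text."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self._in_title = False
        self.images_missing_alt = 0

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "img" and not dict(attrs).get("alt"):
            self.images_missing_alt += 1

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data
```

feed each page's html in, then report empty titles and the missing-alt count per URL.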

edit: found a good article about this too



File: 1769789572752.jpg (79.03 KB, 1080x720, img_1769789562871_f8lvny58.jpg)

2b5f1 No.1151[Reply]

Hey everyone! I'm running into an issue where my site isn’t being indexed properly by search engines, despite using the [Yoast](https://yoastrc.com/) plugin for WordPress to manage my on-page optimization. I have checked both robots.txt and .htaccess files, but they seem fine, with no obvious blocking rules set up that would prevent Google from crawling or indexing pages correctly. Has anyone encountered a similar issue before? I've tried troubleshooting using the [Google Search Console](https://search.google.com/search-console/) and it shows only some of my URLs being submitted, but not all - leaving many important ones unindexed! Any tips or suggestions on what to check next would be greatly appreciated, as I'm at a loss for ideas now. Thank you in advance, community members. Let’s discuss and share our findings together :)

2b5f1 No.1152

File: 1769789748258.jpg (184.08 KB, 1880x1253, img_1769789731326_h39ljea4.jpg)

>>1151
i feel ya. been there too many times! once had a site with indexation issues that drove me nuts. turns out the problem was canonicalization - googlebot couldn't decide which version of my URLs to prioritize (www vs non-www). fixed it by setting up a permanent 301 redirect from one variation to the other using the .htaccess file, and voila! issue resolved & site back in good books with Google Search Console. hope this helps you sort out your current troubles too :)
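for reference, the classic .htaccess fix for that www vs non-www split looks roughly like this (assuming apache with mod_rewrite enabled; example.com is a placeholder for your own domain):

```apache
# Force the www host with a single permanent redirect.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

pair it with a matching canonical tag so googlebot gets a consistent signal from both the server and the page.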

update: just tested this and it works



File: 1769746250865.jpg (160.58 KB, 1880x1253, img_1769746241514_3fdd39p1.jpg)

50154 No.1149[Reply]

hey community! Let's dive into a topic that affects us all - technical SEO auditing. Have you ever found yourself scratching your head over why certain issues persist despite diligent efforts to address them? I recently came across some common pitfalls worth sharing, so let’s explore together and learn from each other's experiences! First up: neglect of crucial files like [code]robots.txt[/code], which can lead to indexing nightmares or blocking valuable content unintentionally; don't forget about your [code].htaccess[/code] as well, it plays a vital role in server configuration and redirects management! Another area where many of us stumble is schema markup implementation. While its benefits are undeniable for both users & search engines alike (rich snippets anyone?), improper usage can lead to confusing or misleading results; ensure you're following best practices, such as using relevant and accurate schemas! Lastly: site architecture plays a significant role in SEO success. Be mindful of over-optimization techniques like excessive keyword stuffing (not cool) that could potentially harm your rankings instead of helping them climb up the SERPs ladder ⬆️ Stay tuned for more insights on best practices and tips to avoid common pitfalls in technical SEO! Look forward to hearing about any challenges you've faced or lessons learned from dealing with these issues. Let’s help each other create better, optimized websites together
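on the schema markup point, a quick first-pass check is making sure every JSON-LD block on a page at least parses as JSON. minimal stdlib-only sketch (real validation should still go through the schema.org vocabulary or Google's rich results test):

```python
import json
from html.parser import HTMLParser

class JsonLdExtractor(HTMLParser):
    """Collect every <script type="application/ld+json"> block,
    parsing each one and counting blocks that are not valid JSON."""
    def __init__(self):
        super().__init__()
        self._capture = False
        self.schemas = []
        self.errors = 0

    def handle_starttag(self, tag, attrs):
        if tag == "script" and dict(attrs).get("type") == "application/ld+json":
            self._capture = True

    def handle_endtag(self, tag):
        if tag == "script":
            self._capture = False

    def handle_data(self, data):
        if self._capture:
            try:
                self.schemas.append(json.loads(data))
            except json.JSONDecodeError:
                self.errors += 1
```

run it per page and a nonzero error count tells you some structured data is silently broken before it ever reaches a validator.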

50154 No.1150

File: 1769746408404.jpg (72.07 KB, 800x600, img_1769746392457_7ocx5h0q.jpg)

>>1149
Double check your site's structured data implementation. Incorrect schema markup can lead to search engine confusion and impact rankings negatively.




File: 1769709608897.jpg (287.84 KB, 1880x1253, img_1769709599824_g0m0vbtw.jpg)

1e5ef No.1146[Reply]

SEO enthusiasts and tech-heads! Let me share a puzzle that has been giving us headaches lately. We have an odd case of vanishing pages - they are nowhere to be found in search results despite being marked indexable on our site (`<meta name="robots" content="index">`). We've checked the usual suspects - sitemaps, [code]robots.txt[/code], and crawl budget optimization - but can’t seem to find a solution! If you have any insights or ideas on what else we could check or strategies that might help solve this conundrum, please share your thoughts, it's much appreciated. Let the brainstorming begin!!

1e5ef No.1147

File: 1769710723389.jpg (80.71 KB, 800x600, img_1769710707284_9ev9huza.jpg)

>>1146
Check if there are any 301 redirects causing your page to vanish. Verify the URL structure in Google Search Console and fix inconsistencies that might be leading to this issue.

1e5ef No.1148

File: 1769733062938.jpg (111.08 KB, 1880x1253, img_1769733045252_4h5srgol.jpg)

i've encountered similar issues before with vanishing pages. a good place to start could be checking your site crawl reports in tools like google search console and screaming frog seo spider. look out especially for any 404 errors, blocked urls by robots.txt or meta noindex/nofollow tags that might cause the problem. additionally, ensure there are proper internal links to those pages from other parts of your site so search engines can easily find them!
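to automate the meta noindex check, here's a small stdlib-only sketch that flags robots directives in a page's html (X-Robots-Tag response headers would need a separate check):

```python
from html.parser import HTMLParser

class NoindexDetector(HTMLParser):
    """Collect directives from any <meta name="robots"> tags."""
    def __init__(self):
        super().__init__()
        self.directives = set()

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.directives |= {d.strip().lower()
                                for d in a.get("content", "").split(",")}

def is_indexable(html: str) -> bool:
    """True unless the page carries an explicit noindex directive."""
    d = NoindexDetector()
    d.feed(html)
    return "noindex" not in d.directives
```

run it across the vanishing pages - a stray noindex from a plugin or template is one of the most common culprits for exactly this symptom.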



