Google Camera Go APK download for all Android devices with Gcam Go Ports


Google introduced Android Go with Android 8.1 Oreo, and the lightweight firmware continues through Android 10 and Android 11 this year. Android Go Edition phones are inexpensive and hence feature very low-end processors; so much so that Google developed an entirely new flavor of the OS for sub-$100 phones, the Android Go Edition, a so-called lite version of Android. For instance, the Nokia 1.3 is a €95 phone that runs the Android Go OS on a Snapdragon 215 SoC. While the device has an upgrade plan for Android 11 Go and even Android 12 Go, the most intriguing part about this low-end smartphone is the Google Camera Go APK.
As for low-end and even mid-range Android devices, most Gcam ports won't work, let alone the official Google Camera APK. This can be solved with Android Go's Google Camera Go APK. The Camera Go app comes pre-installed on the Nokia 1.3. It offers a fair number of features and settings compared to the stock Pixel 4 camera app. The Google Camera Go interface is minimalistic, with four modes: camera (for photos), video, portrait, and translate. Settings appear in a small overlay dialog with toggles for flash, timer, and face enhancer.

Google Camera Go works flawlessly with most Android devices out there, so it doesn't necessarily need a device-specific Gcam port; this is the Gcam Go port! Simply download it to your device, install it, and run it.

This Google Camera Go app has no known bugs or missing features.

Data Vocabulary Schema Deprecated Breadcrumbs error Fix in blogger by Vishesh Grewal

Importance of Breadcrumbs

Why is it important to have valid breadcrumb structured data markup?
Google Search uses breadcrumb markup in the body of a web page to categorize the information from the page in search results. Given the fact that Google is displaying breadcrumbs more prominently in search results, it’s more important than ever to make sure the markup is valid. Breadcrumb markup can be implemented using JSON-LD, RDFa, or Microdata. It can also be implemented as part of a page’s visual design using HTML.
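As a sketch of what valid JSON-LD breadcrumb markup looks like, the snippet below builds a schema.org BreadcrumbList in Python for a hypothetical two-level trail (the page names and URLs are placeholders, not from any real blog):

```python
import json

# Hypothetical breadcrumb trail: Home > SEO label page
crumbs = [
    ("Home", "https://example.blogspot.com/"),
    ("SEO", "https://example.blogspot.com/search/label/SEO"),
]

# Build a schema.org BreadcrumbList; positions are 1-based per the spec.
breadcrumb_list = {
    "@context": "https://schema.org",
    "@type": "BreadcrumbList",
    "itemListElement": [
        {
            "@type": "ListItem",
            "position": i,
            "name": name,
            "item": url,
        }
        for i, (name, url) in enumerate(crumbs, start=1)
    ],
}

# This JSON goes inside a <script type="application/ld+json"> tag on the page.
print(json.dumps(breadcrumb_list, indent=2))
```

Emitting this from a template (instead of the deprecated data-vocabulary.org microdata) is one way to satisfy Google's breadcrumb validation.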

How to Fix the Breadcrumbs Issue?

  • You get an email from Google mentioning that your posts have breadcrumbs issues.
  • While submitting your post in the Google Webmaster tool, you are able to submit the post for indexing, but when testing the URL you get a warning like “Data Vocabulary Schema Deprecated Breadcrumbs error”.
Don’t worry, I found an instant solution for this which is 100% working and tested. I found this solution because I also recently got an email from Google Search Console about a breadcrumbs issue on my blog.

Steps To Fix Breadcrumbs Issue:-

  • Go to your Blogger dashboard, navigate to “Theme”, and click “Edit HTML”.
  • Click anywhere in the HTML code window of your Blogger theme and search for “.breadcrumbs a:hover” OR “.breadcrumbs span a:hover”. You will definitely find one of the two.
  • Paste this code:

.breadcrumbs svg{width:16px;height:16px;vertical-align:-4px}

.breadcrumbs svg path{fill:#666} 

  • After that, search again for the line below in your Blogger theme HTML:
  •  <b:includable id='backlinks' 
    • Paste this code.
    <b:includable id='breadcrumb' var='posts'> 
    <b:if cond='data:blog.pageType == &quot;item&quot;'> 
    <b:loop values='data:posts' var='post'> 
    <b:if cond='data:post.labels'> 
    <div class='breadcrumbs' itemscope='itemscope' itemtype='https://schema.org/BreadcrumbList'>
     <svg viewBox='0 0 24 24'>
    <path d='M10,20V14H14V20H19V12H22L12,3L2,12H5V20H10Z' fill='#000000'/></svg> 
    <span itemprop='itemListElement' itemscope='itemscope' itemtype='https://schema.org/ListItem'>
     <a expr:href='data:blog.homepageUrl' title='Home' itemprop='item'> 
    <span itemprop='name'>Home</span></a>
     <meta content='1' itemprop='position'/> </span> 
    <svg viewBox='0 0 24 24'><path d='M5.5,9A1.5,1.5 0 0,0 7,7.5A1.5,1.5 0 0,0 5.5,6A1.5,1.5 0 0,0 4,7.5A1.5,1.5 0 0,0 5.5,9M17.41,11.58C17.77,11.94 18,12.44 18,13C18,13.55 17.78,14.05 17.41,14.41L12.41,19.41C12.05,19.77 11.55,20 11,20C10.45,20 9.95,19.78 9.58,19.41L2.59,12.42C2.22,12.05 2,11.55 2,11V6C2,4.89 2.89,4 4,4H9C9.55,4 10.05,4.22 10.41,4.58L17.41,11.58M13.54,5.71L14.54,4.71L21.41,11.58C21.78,11.94 22,12.45 22,13C22,13.55 21.78,14.05 21.42,14.41L16.04,19.79L15.04,18.79L20.75,13L13.54,5.71Z' fill='#000000'/></svg>
     <b:loop index='num' values='data:post.labels' var='label'> 
    <span itemprop='itemListElement' itemscope='itemscope' itemtype='https://schema.org/ListItem'>
     <a expr:href='data:label.url + &quot;?&amp;max-results=16&quot;' expr:title='data:label.name' itemprop='item'>
     <span itemprop='name'><data:label.name/></span> </a> 
    <meta expr:content='data:num+2' itemprop='position'/> </span> 
    <b:if cond='data:label.isLast != &quot;true&quot;'>
     <svg viewBox='0 0 24 24'><path d='M8.59,16.58L13.17,12L8.59,7.41L10,6L16,12L10,18L8.59,16.58Z' fill='#000000'/></svg> 
    </b:if> 
    </b:loop> 
    <svg viewBox='0 0 24 24'><path d='M8.59,16.58L13.17,12L8.59,7.41L10,6L16,12L10,18L8.59,16.58Z' fill='#000000'/></svg> 
    <span><data:post.title/></span> </div> </b:if> </b:loop> </b:if> </b:includable>
    
    How To Add Custom Robots.txt File in Blogger?


    What is Robots.txt?

    Robots.txt is a text file containing a few lines of simple code.
    It is saved on the website or blog's server and instructs web crawlers on how to crawl and index your blog in the search results.
    That means you can restrict any web page on your blog from web crawlers so that it doesn't get indexed in search engines, such as your blog's labels page, your demo page, or any other pages that are not important enough to be indexed.
    Always remember that search crawlers scan the robots.txt file before crawling any web page.
    Each blog hosted on Blogger has a default robots.txt file, which looks something like this:

    User-agent: Mediapartners-Google
    Disallow:
    User-agent: *
    Disallow: /search
    Allow: /
    Sitemap: http://example.blogspot.com/sitemap.xml

     

    Explanation:-

    This code is divided into three sections. Let's first study each of them; after that we will learn how to add a custom robots.txt file to Blogspot blogs.

    User-agent: Mediapartners-Google

    This rule is for the Google AdSense robots and helps them serve better ads on your blog. Whether or not you use Google AdSense on your blog, simply leave it as it is.

    User-agent: * 

    This is for all robots, marked with the asterisk (*). In the default settings, our blog's label links are restricted from being indexed by search crawlers, meaning the web crawlers will not index our label page links, because of the line below.

    Disallow: /search

    That means links having the keyword "search" just after the domain name will be ignored. See the example below, which is a link to the label page named SEO.

    https://visheshgrewal.blogspot.com/search/label/SEO 

    And if we remove Disallow: /search from the above code then crawlers will access our entire blog to index and crawl all of its content and web pages.

    Here Allow: / refers to the Homepage that means web crawlers can crawl and index our blog's homepage.
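The effect of these default rules can be checked with Python's standard-library robots.txt parser; the sketch below uses a placeholder blog URL and the default Blogger rules quoted above:

```python
from urllib.robotparser import RobotFileParser

# The default Blogger robots.txt rules quoted above.
rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

base = "https://example.blogspot.com"

# Label pages live under /search, so generic crawlers are blocked...
print(parser.can_fetch("*", base + "/search/label/SEO"))  # False

# ...while the homepage and ordinary posts remain crawlable.
print(parser.can_fetch("*", base + "/"))                       # True
print(parser.can_fetch("*", base + "/2018/03/post-url.html"))  # True
```

This is a handy way to sanity-check a custom robots.txt before pasting it into Blogger.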


    Disallow Particular Post

    Now suppose we want to exclude a particular post from indexing; then we can add the line below to the code.

    Disallow: /yyyy/mm/post-url.html

    Here yyyy and mm refer to the post's publishing year and month respectively. For example, if we published a post in March 2018, we would use the format below.

    Disallow: /2018/03/post-url.html

    To make this task easy, you can simply copy the post URL and remove the blog name from the beginning.
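That "copy the URL and drop the blog name" step can be sketched with Python's urllib.parse; the post URL here is a made-up example:

```python
from urllib.parse import urlparse

def disallow_line(post_url: str) -> str:
    """Turn a full post URL into the matching robots.txt Disallow line."""
    # urlparse().path keeps /yyyy/mm/post-url.html and drops the domain.
    path = urlparse(post_url).path
    return f"Disallow: {path}"

print(disallow_line("https://example.blogspot.com/2018/03/post-url.html"))
# Disallow: /2018/03/post-url.html
```

The same helper works for static pages, whose paths start with /p/ instead of a date.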


    Disallow Particular Page

    If we need to disallow a particular page, we can use the same method as above. Simply copy the page URL and remove the blog address from it, which will look something like this:

    Disallow: /p/page-url.html


    Adding Custom Robots.Txt to Blogger

    Now for the main part of this tutorial: how to add a custom robots.txt file in Blogger. Below are the steps to add it.
    • Go to your blogger blog.
    • Navigate to Settings >> Search Preferences >> Crawlers and indexing >> Custom robots.txt >> Edit >> Yes
    • Now paste your robots.txt file code in the box.
    • Click on Save Changes button.
    • You are done! 


    How to Check Your Robots.txt File? 

    You can check this file on your blog by adding /robots.txt at the end of your blog URL in the web browser. For example:
    http://xyz.blogspot.com/robots.txt

    Once you visit the robots.txt file URL you will see the entire code which you are using in your custom robots.txt file.



    How to Submit your Site on Google Step-by-step Guide


    To get your site or blog on Google Search, here is a step-by-step guide to submitting your website/blog to Google. If your website/blog isn't in Google's index, it can't be found when a user makes a search. Google needs to know that your site exists to be able to crawl it and include it in its index.

    You must have a Google Account to submit your site to Google.

    How to Submit your Website/Blog on Google Search?


    1. Go to the Google Webmaster Tool.
    2. Click on the menu bar (if you use the mobile version) and then click on +Add Property.
    3. Now enter your domain name.
    4. Then verify your domain by adding a CNAME record, uploading an HTML file to your website's file manager (hosting account required), or adding a meta tag to your site/blog's head.
    5. Google then verifies your domain and starts crawling your site/blog.

    If you face any errors, just leave a comment.