Video Embedder with Transcription (SEO): the latest generation of video player for SEO performance

The Vidseo plugin was designed to revolutionize the way video is used on the Web in general.

The main problem with video is that the content narrated in it is not indexable by search engines (even when you provide a video sitemap).

In other words,

  • When you publish a video on YouTube or Vimeo, the only elements that allow it to generate organic visibility (and thus appear in the SERPs) are the title and the description you attach to the video.
  • When you embed a YouTube or Vimeo video on your website, the only positive SEO factor you gain is longer session time on your pages, which can improve your ranking for a given keyword.

Video is one of the most effective formats on the Web for conversion. As it stands, however, it is one of the weakest for organic SEO, and therefore for performance in the SERPs.

Now, imagine that all the content developed in a video became usable and indexable, in text form, when you embed that video in your web page. Imagine using a 5-10 minute video presenting your services and products, where everything you say in the video, in addition to the text your page already contains, could be integrated textually on your page, elegantly and discreetly (UX).

 This is exactly what Vidseo allows you to do.

Now imagine the COLOSSAL amount of text that could be drawn from all the videos published on YouTube and Vimeo, considering that over 500 hours of content is uploaded every minute. We are talking about ORIGINAL content, never before indexed by search engines.

Let’s go even further. Imagine, using this type of feature, a website consisting only of videos: no more text, no more images, only videos with embedded transcripts generating organic SEO.

How do you think search engines will treat such a page? It’s a safe bet they will see it as a top-quality reference, given the amount of original content related to your service offer.

How to skyrocket your website’s SEO with video transcripts

The Vidseo plugin offers a completely new way of generating top-quality content for your SEO. It lets you integrate the transcripts automatically generated by YouTube for your videos (or any videos you find), and it also lets you edit, correct, optimize and format them (markup, layout) so they are elegant and fully indexable, without harming the user experience (UX).

The advantage of Vidseo is that it lets you control how these video transcripts are presented to your visitors. You can:

  • Display an excerpt of a length you define below the video; a single click expands it to the full transcript
  • Display the entire transcript directly below the video
  • Hide the transcript from your visitors entirely, even though it remains in your page’s HTML

The other advantage is that you can edit the transcription automatically generated by YouTube, which is not always reliable. You can rewrite the text, break it into paragraphs, add heading tags that help your SEO, as well as links and anything else you would use in a standard page layout.
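For example, once corrected, a short passage of a transcript could be marked up along these lines in the PRO editor (the heading text and the link target are purely illustrative, not part of the plugin):

<h2>What is a robots.txt file?</h2>
<p>The robots.txt file restricts access to your website by search engine robots. See our <a href="/seo-guide/">SEO guide</a> for the common rules.</p>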

The Vidseo plugin lets you do what no other WordPress plugin, or even YouTube itself(!), is able to do: make the content developed in your videos effectively indexable. Millions of hours of content become available to generate SEO like never before!

How to use the Vidseo plugin? Skyrocket your SEO with video transcription

Once installed, go to the Vidseo post type section and create your first shortcode:

  • Copy-paste your Video URL from Youtube/Vimeo
  • Define where this video is from (Youtube/Vimeo)
  • Insert your content or transcription
  • Customize your content (HTML editing is available only in the PRO version)
  • Provide a title
  • Copy-paste shortcode to your pages/posts/products
  • Customize your shortcode with attributes (meaning you can display the video with settings other than those defined on the VidSEO settings page)

Available Attributes

  • Player width: width="XXXpx" (define the width in pixels or percentage; don't forget to add "px" or "%" at the end)
  • Hide title area: title="0" (true or false, 1 or 0)
  • Disable transcript: transcript="0" (true or false, 1 or 0; useful if you want to use VidSEO as a regular player)
  • Excerpt length: excerpt="60" (length in characters)
  • Hidden transcript: trans_hidden="1" (hides the transcription from users while keeping it in the page for search engines)

Example: [vid-seo id="123" width="600px" transcript="0" title="0"]
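Two more examples, built only from the attributes listed above (the ID "123" is illustrative; use the ID of your own Vidseo post):

Example: [vid-seo id="123" width="100%" trans_hidden="1"] (full-width player, transcript kept in the page but hidden from visitors)

Example: [vid-seo id="123" excerpt="40"] (default player with a shorter 40-character excerpt below the video)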

VidSEO PRO with HTML editor

VidSEO FREE, no HTML edition

1. Make your first shortcode (FREE): provide the transcript, add the video URL and select the video host

The VidSEO player is very easy to use. Once your content is added in the textarea (no formatting allowed), simply add your video URL (from your own channel or any other video found on YouTube/Vimeo), make sure to select the proper host for the video, and save.

The VidSEO plugin will then provide a shortcode to copy-paste into your page, post, product, etc. Please note that custom shortcode attributes are available in the FREE version.

2. Make your first shortcode (PRO): provide the transcript, format it with headings, links and images, add the video URL and select the video host

The VidSEO PRO plugin provides much more flexibility. You have access to the Classic editor to format your content with headings, links, images and more. You can also choose to hide the transcript from users on the front end while the content remains in the page for search engines.

If needed, you can also modify the title area and the transcription box background.

Please note that custom attributes are also available in the PRO version.

VidSEO PRO with HTML editor

Variations: what can be done with the VidSEO plugin?

Now that you understand how to benefit from Vidseo for your own content, by using video transcription to generate organic traffic to your site, imagine the amount of information available on Youtube, and especially, all the content yet to come!

That is millions of hours of content you can use, directly (via the excerpt) or indirectly (by hiding the transcript), to boost your SEO and, of course, the visibility of your services and products.

On the Web, the two most important elements these days are:

  • Content, because that’s what allows you to generate SEO and organic traffic to your site (content marketing)
  • Video, because that’s what generates the most conversion (and interaction) on the web and on social media, and which, when integrated into a website, tends to increase session time (positively impacting your Google ranking for a given keyword)

Content is so important that, in reality, it is what allows your video to generate SEO (on Google, YouTube, etc.), via the title and description you provide on YouTube/Vimeo when you publish the video. Content is KING, as is well known.

That’s why the best SEO strategies are always accompanied by content production: content is the only thing that allows search engines to understand your site and your pages, and to offer them to Internet users when a search query matches them.

Until now, however, it was never possible to combine text AND video for SEO purposes. Content and video have always been treated as two separate entities, each requiring its own strategy.

It’s now a thing of the past.

FREE version: no formatting / player width: 500px (default) / excerpt: 80

Robots.txt explained


Hi. Welcome to this week's web design video blog.

Today Nick and I are going to explain the purpose of the robots.txt file and also share the common rules that you might want to use to communicate with search engine robots like Googlebot.

So the primary purpose of the robots.txt file is to restrict access to your website by search engine robots, or bots. The file is quite literally a simple .txt text file that can be opened and created in almost any notepad, HTML editor or word processor. To make a start, name your file robots.txt and add it to the root layer of your website. This is quite important, as all of the main reputable search engine spiders will automatically look for this file to take instruction before crawling your website. So here's how the file should look to start with. On the very first line, add "User-agent". This first command essentially addresses the instructions to all search bots. Once you've addressed a specific bot, or in our case, with the asterisk, all search bots, you come on to the Allow and Disallow commands that you can use to specify your restrictions. To simply ban the bots from the entire website directory, including the home page, you'll add the following code: Disallow with a capital D, a colon, a space and a forward slash. This first forward slash represents the root layer of your website.

In most cases you won't want to restrict your entire website, just specific folders or files. To do this you specify each restriction on its own line, preceded by the Disallow command. In the example here you can see the necessary code to restrict access to a folder called admin, along with everything inside it.

If you're looking to restrict individual pages or files, the format is very similar: on line 4 we're not restricting the entire secure folder, just one HTML file within it. You should bear in mind that these directives are case sensitive. You need to ensure that what you specify in your robots.txt file exactly matches the file or folder name on your website. So those are the basics, and next we come on to slightly more advanced pattern matching techniques.

These can be quite handy if you're looking to block files or directories in bulk without adding lots of lines of commands to your robots file. These bulk commands are known as pattern matching, and the most common one that you might want to use would be to restrict access to all dynamically generated URLs that contain a question mark, for example. So if you check out line 5, all you need to do to catch all of these types is the forward slash, an asterisk and then a question mark symbol. You can also use pattern matching to block access to all directories that begin with admin, for example: if you check out line 6, you'll see how we've again used an asterisk to match all directories that start with admin.

So again, if for example you had the following folders in your root directory, admin-panel, admin-files and admin-secure, then this one line in your robots file would block access to all three of these folders, as they all start with admin. The final pattern matching command is the ability to identify and restrict access to all files that end with a specific extension. On line 7 you'll see how a single short command will instruct the search engine bots and spiders not to crawl, or ideally cache, any pages that contain the .php extension. So after your initial forward slash, use the asterisk followed by a full stop and then the extension. To signify an extension instead of a string, you conclude the command with a dollar sign. The dollar tells the bots that the file to be restricted must end with this extension. So that's it: using those six different techniques you should be able to put together a well optimized robots.txt file that flags content to search engines that you don't wish to be crawled or cached.

It is important that we point out, however, that website hackers often frequent robots.txt files, as they can indicate where security vulnerabilities may lie that they might want to throw themselves at. Always be sure to password protect and test the security of your dynamic pages, particularly if you're advertising their location in a robots.txt file.

Thank you for watching this week's web design video blog.
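The transcript refers to numbered lines of a robots.txt file shown on screen in the video, which the text alone does not reproduce. Based on the description above, that file would look roughly like this (the admin, secure and login.html names are illustrative, not taken from the video):

User-agent: *                 # line 1: address the rules to all search bots
Disallow: /                   # line 2: block the entire site, including the home page
Disallow: /admin/             # line 3: block the admin folder and everything inside it
Disallow: /secure/login.html  # line 4: block a single file inside the secure folder
Disallow: /*?                 # line 5: block dynamically generated URLs containing a question mark
Disallow: /admin*             # line 6: block every directory whose name starts with admin
Disallow: /*.php$             # line 7: block every file ending with the .php extension

Note that in a real file you would not keep line 2 alongside the more specific rules, since "Disallow: /" already blocks the whole site; the lines are shown together here only to match the numbering used in the video.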

PRO version: Classic editor / player width: 500px (default) / excerpt: 80

Robots.txt explained PRO

Click here to see Video transcripts and all the content that could be added to your website from Automatic Youtube Video transcription

(The expanded transcript in this demo is the same robots.txt text shown above, here formatted in the PRO Classic editor with headings and links.)

FREE version: no formatting / player width: 600px (custom) / excerpt: 40 / no title

(This demo displays the same robots.txt transcript as above, with a 600px player, a 40-character excerpt and no title area.)

PRO version, used as a regular player: player width: 400px (custom) / no drop-down container / no title / no transcript
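Based on the attributes documented above, a configuration like this one corresponds to a shortcode along these lines (the ID is illustrative):

Example: [vid-seo id="123" width="400px" title="0" transcript="0"]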

PRO version: Classic editor / player width: 550px (custom) / excerpt: 60 / no title

(This demo displays the same PRO-formatted transcript as above, with a 550px player, a 60-character excerpt and no title area.)

PRO version: Classic editor / player width: 550px (custom) / transcript hidden

Robots.txt explained PRO

How to get a YouTube video transcription: two simple and very effective methods

METHOD 1: generate a transcript of a YouTube video

  • Go to YouTube and open the video of your choice.
  • Click on the More actions button (represented by 3 horizontal dots) located next to the Share button
  • Now click on the Transcript option
  • A transcript of the closed captions will automatically be generated
  • Copy the content and paste it into the Vidseo FREE editor or the Vidseo PRO visual editor.

METHOD 2: DIY Captions (YouTube video captions & subtitles)

This second method is also quite simple and easy to use. DIY Captions is third-party software, not affiliated with YouTube, and free to use.

  • Step 1: Go to the DIY Captions website: https://www.diycaptions.com/
  • Step 2: Enter the YouTube video URL in the given box and click the Get Text button to get the full text of the audio.
  • Step 3: Copy the content and paste it into the Vidseo FREE editor or the Vidseo PRO visual editor.