How to get data with JavaScript from another server?

You should check out jQuery. It has a rich base of AJAX functionality that can give you the power to do all of this. You can load in an external page and parse its HTML content with intuitive CSS-like selectors.

An example using $.get():

$.get("anotherPage.html", {}, function(results){
  alert(results); // will show the HTML from anotherPage.html
  alert($(results).find("div.scores").html()); // show "scores" div in results
});

For external domains I've had to author a local PHP script that will act as a middle-man. jQuery will call the local PHP script passing in another server's URL as an argument, the local PHP script will gather the data, and jQuery will read the data from the local PHP script.

$.get("middleman.php", {"site":"http://www.google.com"}, function(results){
  alert(results); // middleman gives Google's HTML to jQuery
});

with middleman.php containing something along the lines of:

<?php

  // Do not use as-is, this is only an example.
  // Validate or whitelist $_GET["site"] first, or anyone can make
  // your server fetch arbitrary URLs on their behalf.
  // $_GET["site"] is set by jQuery to "http://www.google.com"
  print file_get_contents($_GET["site"]);

?>

This is rather easy... if you know the 'secret' trick almost nobody shares...

It's called Yahoo YQL...

So in order to regain 'power to the user' (and returning to the convenient mantra: 'never accept no'), just use http://query.yahooapis.com/ (instead of a server-side PHP proxy script).
jQuery would not be strictly needed.

EXAMPLE 1:
Using the SQL-like command:

select * from html 
where url="http://stackoverflow.com" 
and xpath='//div/h3/a'

The following link will scrape SO for the newest questions (bypassing the cross-domain security restrictions):
http://query.yahooapis.com/v1/public/yql?q=select%20title%20from%20html%20where%20url%3D%22http%3A%2F%2Fstackoverflow.com%22%20and%0A%20%20%20%20%20%20xpath%3D%27%2F%2Fdiv%2Fh3%2Fa%27%0A%20%20%20%20&format=json&callback=cbfunc

As you can see, this returns a JSON array (one can also choose XML) and invokes the callback function cbfunc.
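The long URL above can be assembled programmatically. A minimal sketch in plain JavaScript (no jQuery), assuming the same query and callback name as the example; in a browser you would load the URL via a script tag, which is how JSONP sidesteps the XMLHttpRequest same-origin restriction:

```javascript
// Build the YQL request URL from its parts (matches the example above).
var query = 'select title from html ' +
            'where url="http://stackoverflow.com" and xpath=\'//div/h3/a\'';
var url = 'http://query.yahooapis.com/v1/public/yql' +
          '?q=' + encodeURIComponent(query) +
          '&format=json&callback=cbfunc';

// JSONP: the server wraps its JSON answer in a call to cbfunc(...),
// so defining that global function is how you receive the data.
function cbfunc(data) {
  // YQL puts the scraped nodes under data.query.results
  console.log(data.query.results);
}

// In a browser, injecting a <script> tag executes the response:
//   var s = document.createElement('script');
//   s.src = url;
//   document.body.appendChild(s);
```

The script-tag injection is the whole trick: script elements are exempt from the same-origin policy, so the callback fires with data from another domain.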

Indeed, as a 'bonus' you also save a kitten every time you don't have to regex data out of 'tag-soup'.

Do you hear your little mad scientist inside yourself starting to giggle?

Then see this answer for more info (and don't forget its comments for more examples).

Good Luck!


Write a proxy script that forwards the HTTP request from your domain; this bypasses the browser's XMLHttpRequest same-origin restrictions.

If you're using PHP, simply use cURL to request and read the page, then output the HTML as if it came from your domain.


update 2018:

You can only make a cross-domain request under one of the following four conditions:

  • the response includes the header Access-Control-Allow-Origin: * (or one naming your origin)

Demo

$.ajax({
  url: 'https://api.myjson.com/bins/bq6eu',
  success: function(response){
    console.log(response.string);
  },
  error: function(response){
    console.log('server error');
  }
})
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.11.1/jquery.min.js"></script>
  • use a server as a bridge or proxy to the target

Demo:

$.ajax({
  url: 'https://cors-anywhere.herokuapp.com/http://whatismyip.akamai.com/',
  success: function(response){
    console.log('server IP: ' + response);
  },
  error: function(response){
    console.log('bridge server error');
  }
})
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.11.1/jquery.min.js"></script>
  • use a browser add-on that injects Access-Control-Allow-Origin: * into responses
  • disable the browser's web security

Chrome

chrome.exe --disable-web-security --user-data-dir="C:\tmp\chrome-dev-profile"

(recent Chrome versions ignore --disable-web-security unless a separate --user-data-dir is also given)

Firefox

about:config -> security.fileuri.strict_origin_policy -> false

end


noob old answer 2011

$.get() can get data from jsbin.com, but it can't get data from another site like google.com (most likely because jsbin.com sends permissive CORS headers and google.com doesn't).

$.get('http://jsbin.com/ufotu5', {}, function(results){
  alert(results);
});

Demo: http://jsfiddle.net/Xj234/ (tested with Firefox, Chrome and Safari).