Is there a faster way to download a page from the net to a string?

Posted by cphil5 on Stack Overflow. Published on 2010-06-16T06:02:03Z.

I have tried other methods to download content from a URL, but I need a faster one. I have to download and parse about 250 separate pages, and I'd like the app not to feel ridiculously slow. This is the code I'm currently using to retrieve a single page; any insight would be great.

import java.io.BufferedInputStream;
import java.io.InputStream;
import java.net.URL;
import java.net.URLConnection;
import org.apache.http.util.ByteArrayBuffer;

String tempString = null;
try 
{
    URL myURL = new URL("http://www.google.com");
    URLConnection ucon = myURL.openConnection();
    InputStream inputStream = ucon.getInputStream();
    BufferedInputStream bufferedInputStream = new BufferedInputStream(inputStream);
    ByteArrayBuffer byteArrayBuffer = new ByteArrayBuffer(50);
    // Reads the response one byte at a time -- this is the slow part.
    int current;
    while ((current = bufferedInputStream.read()) != -1) {
        byteArrayBuffer.append((byte) current);
    }
    bufferedInputStream.close();
    tempString = new String(byteArrayBuffer.toByteArray(), "UTF-8");
} 
catch (Exception e) 
{
    Log.i("Error", e.toString());
}
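The loop above issues one `read()` call per byte; reading in multi-kilobyte chunks usually cuts per-page time dramatically. A minimal sketch of that approach, using only the JDK (the class and method names here are my own, not from the question):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.net.URL;
import java.net.URLConnection;

public class PageDownloader {
    // Drains a stream into a String using 8 KB chunks instead of single bytes.
    static String readAll(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] chunk = new byte[8192];
        int bytesRead;
        while ((bytesRead = in.read(chunk)) != -1) {
            out.write(chunk, 0, bytesRead);
        }
        return out.toString("UTF-8");
    }

    // Fetches one page as a String.
    public static String download(String pageUrl) throws IOException {
        URLConnection connection = new URL(pageUrl).openConnection();
        try (InputStream in = connection.getInputStream()) {
            return readAll(in);
        }
    }
}
```

The chunk size of 8192 bytes is a common default; the exact value matters far less than avoiding per-byte calls.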

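For roughly 250 pages, the bigger win is usually overlapping network latency by fetching several pages concurrently rather than micro-optimizing a single download. A sketch using a fixed thread pool (the URL list, pool size, and class name are illustrative):

```java
import java.net.URL;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.Scanner;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ParallelFetch {
    public static void main(String[] args) throws Exception {
        // Illustrative URL list -- replace with the real ~250 pages.
        List<String> urls = Arrays.asList("http://www.google.com", "http://www.example.com");
        ExecutorService pool = Executors.newFixedThreadPool(8); // 8 downloads in flight at once
        List<Future<String>> results = new ArrayList<>();
        for (String u : urls) {
            results.add(pool.submit(() -> {
                try (Scanner s = new Scanner(new URL(u).openStream(), "UTF-8")) {
                    return s.useDelimiter("\\A").next(); // whole stream as one token
                }
            }));
        }
        for (Future<String> f : results) {
            String page = f.get(); // blocks until that download completes
            // parse page here
        }
        pool.shutdown();
    }
}
```

On Android specifically, network calls must stay off the UI thread anyway, so a pool like this (or AsyncTask, in the 2010-era API) fits naturally.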
