Search Results

Search found 20873 results on 835 pages for 'url fetch'.

Page 235/835 | < Previous Page | 231 232 233 234 235 236 237 238 239 240 241 242  | Next Page >

  • How to query range of data in DB2 with highest performance?

    - by Fuangwith S.
    Usually I need to retrieve data from a table in some range; for example, a separate page for each set of search results. In MySQL I use the LIMIT keyword, but I don't know the DB2 equivalent. At the moment I use this query to retrieve a range of data:

        SELECT *
        FROM (
            SELECT SMALLINT(RANK() OVER (ORDER BY NAME DESC)) AS RUNNING_NO,
                   DATA_KEY_VALUE,
                   SHOW_PRIORITY
            FROM EMPLOYEE
            WHERE NAME LIKE 'DEL%'
            ORDER BY NAME DESC
            FETCH FIRST 20 ROWS ONLY
        ) AS TMP
        ORDER BY TMP.RUNNING_NO ASC
        FETCH FIRST 10 ROWS ONLY

    but I know it's bad style. So how should I write the query for the highest performance?
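
    The usual alternative in DB2 (not from the original question) is to filter on a ROW_NUMBER() window directly, so the database only materialises the rows of the requested page. Below is a minimal JDBC sketch under that assumption, reusing the table and columns from the question and passing the page boundaries as parameters:

        import java.sql.Connection;
        import java.sql.PreparedStatement;
        import java.sql.ResultSet;
        import java.sql.SQLException;

        public final class EmployeePager {
            // Fetch rows firstRow..lastRow of the ordered result, e.g. 11..20 for page 2 of size 10.
            public static void fetchPage(Connection connection, int firstRow, int lastRow) throws SQLException {
                String sql =
                      "SELECT RUNNING_NO, DATA_KEY_VALUE, SHOW_PRIORITY FROM ("
                    + "  SELECT ROW_NUMBER() OVER (ORDER BY NAME DESC) AS RUNNING_NO,"
                    + "         DATA_KEY_VALUE, SHOW_PRIORITY"
                    + "  FROM EMPLOYEE WHERE NAME LIKE 'DEL%'"
                    + ") AS TMP WHERE RUNNING_NO BETWEEN ? AND ? ORDER BY RUNNING_NO";
                try (PreparedStatement ps = connection.prepareStatement(sql)) {
                    ps.setInt(1, firstRow);
                    ps.setInt(2, lastRow);
                    try (ResultSet rs = ps.executeQuery()) {
                        while (rs.next()) {
                            System.out.println(rs.getInt("RUNNING_NO") + " " + rs.getString("DATA_KEY_VALUE"));
                        }
                    }
                }
            }
        }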

    Read the article

  • Help me make a jquery AJAXed divs' links work like an iframe.

    - by Dave
    I want to make a few divs on the same page work similarly to iframes. Each will load a URL which contains links. When you click on those links, I want an AJAX request to go out and replace the div's HTML with new HTML from the page of the clicked link. It will be very similar to surfing a page inside an iframe. Here is my code to initially load the divs (this code works):

        onload:
        $.ajax({
            url: "http://www.foo.com/videos.php",
            cache: false,
            success: function(html){
                $("#HowToVideos").replaceWith(html);
            }
        });
        $.ajax({
            url: "http://www.foo.com/projects.php",
            cache: false,
            success: function(html){
                $("#HowToProjects").replaceWith(html);
            }
        });

    This is a sample of code that I'm not quite sure how to implement, but it explains the concept. Could I get some help with some selectors (surrounded by ?'s), and/or let me know what the correct way of doing this is? I also want to display a loading icon, and I need to know the right place to put that function.

        $(".ajaxarea a").click(function(){
            var linksURL = this.href;
            // var ParentingAjaxArea = $(this).closest(".ajaxarea");
            $.ajax({
                url: linksURL,
                cache: false,
                success: function(html){
                    $(ParentingAjaxArea).replaceWith(html);
                }
            });
            return false;
        });
        $(".ajaxarea").ajaxStart(function(){
            // show loading icon
        });

    Read the article

  • How to connect to MSSQL using ActiveRecord, JDBC, jTDS and integrated security

    - by Rob
    As per the title, I've tried:

        establish_connection(:adapter => "jdbcmssql",
          :url => "jdbc:jtds:sqlserver://myserver:1433/mydatabase;domain='mynetwork';",
          :username => 'user', :password => 'pass')

        establish_connection(:adapter => "jdbcmssql",
          :url => 'jdbc:jtds:sqlserver://myserver:1433/mydatabase;domain="mynetwork";user="mynetwork\user"')

        establish_connection(:adapter => "jdbcmssql",
          :url => "jdbc:jtds:sqlserver://myserver:1433/mydatabase;domain='mynetwork';",
          :username => 'user')

        establish_connection(:adapter => "jdbcmssql",
          :url => "jdbc:jtds:sqlserver://myserver:1433/mydatabase;domain='mynetwork';integratedSecurity='true'",
          :username => 'user')

    ... and various other combinations. Each time I get:

        net/sourceforge/jtds/jdbc/SQLDiagnostic.java:368:in `addDiagnostic':
        java.sql.SQLException: Login failed for user ''.
        The user is not associated with a trusted SQL Server connection. (NativeException)

    Any tips? Thanks. Versions in use:

        activerecord (2.3.5)
        activerecord-jdbc-adapter (0.9.6)
        activerecord-jdbcmssql-adapter (0.9.6)
        jdbc-jtds (1.2.5)
        jruby 1.4.0 (ruby 1.8.7 patchlevel 174) (2009-11-02 69fbfa3) (Java HotSpot(TM) Client VM 1.6.0_18) [x86-java]
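
    One way to take ActiveRecord out of the picture is to test the jTDS URL from plain JDBC first. The sketch below is hypothetical (server, database and credentials are placeholders, and it assumes a jTDS version that understands the useNTLMv2 property). Note that property values in a jTDS URL are not quoted, so domain='mynetwork' would be sent with the quotes included, and true single sign-on additionally requires jTDS's ntlmauth.dll on the library path; otherwise the domain property plus an explicit username/password is the expected combination.

        import java.sql.Connection;
        import java.sql.DriverManager;

        public class JtdsSmokeTest {
            public static void main(String[] args) throws Exception {
                Class.forName("net.sourceforge.jtds.jdbc.Driver");
                // No quotes around property values; domain switches jTDS to Windows (NTLM) authentication.
                String url = "jdbc:jtds:sqlserver://myserver:1433/mydatabase"
                           + ";domain=mynetwork;useNTLMv2=true";
                try (Connection conn = DriverManager.getConnection(url, "user", "pass")) {
                    System.out.println("Connected: " + !conn.isClosed());
                }
            }
        }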

    Read the article

  • A question on webpage representation in Java

    - by Gemma
    Hello there. I've followed a tutorial and came up with the following method to read a webpage's content into a CharSequence:

        public static CharSequence getURLContent(URL url) throws IOException {
            URLConnection conn = url.openConnection();
            String encoding = conn.getContentEncoding();
            if (encoding == null) {
                encoding = "ISO-8859-1";
            }
            BufferedReader br = new BufferedReader(new InputStreamReader(conn.getInputStream(), encoding));
            StringBuilder sb = new StringBuilder(16384);
            try {
                String line;
                while ((line = br.readLine()) != null) {
                    sb.append(line);
                    sb.append('\n');
                }
            } finally {
                br.close();
            }
            return sb;
        }

    It returns a representation of the webpage specified by the URL. However, this representation is hugely different from what I see with "View Page Source" in Firefox, and since I need to scrape data from the original webpage (some data segments in the original "view page source" output), it always fails to find the required text in this Java representation. Did I go wrong somewhere? I need your advice, guys. Thanks a lot for helping!
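
    One common cause worth ruling out (an assumption, not something stated in the question): many servers return different markup when no browser-like User-Agent header is sent, and any content that Firefox builds with JavaScript after the page loads will never appear in what URLConnection downloads. A minimal variant of the method that at least sends a browser-style User-Agent:

        import java.io.BufferedReader;
        import java.io.IOException;
        import java.io.InputStreamReader;
        import java.net.URL;
        import java.net.URLConnection;

        public final class PageFetcher {
            public static CharSequence getURLContentAsBrowser(URL url) throws IOException {
                URLConnection conn = url.openConnection();
                // Identify ourselves like a browser; the exact string is just an example.
                conn.setRequestProperty("User-Agent", "Mozilla/5.0 (compatible; PageFetcher/1.0)");
                String encoding = conn.getContentEncoding();
                if (encoding == null) {
                    encoding = "ISO-8859-1";
                }
                StringBuilder sb = new StringBuilder(16384);
                BufferedReader br = new BufferedReader(new InputStreamReader(conn.getInputStream(), encoding));
                try {
                    String line;
                    while ((line = br.readLine()) != null) {
                        sb.append(line).append('\n');
                    }
                } finally {
                    br.close();
                }
                return sb;
            }
        }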

    Read the article

  • Subsonic 3.0 select Query using DateColumn

    - by vineth
    Hi, while selecting records using a date field I am facing a problem. My SQL Server 2005 view (ViewOrders) has a StartDate column containing:

        4/23/2010 12:00:00 AM
        4/23/2010 12:00:00 AM
        4/23/2010 12:00:00 AM
        4/23/2010 12:00:00 AM
        4/23/2010 1:07:00 PM

    My code using SubSonic 3.0:

        AMDB ctx = new AMDB();
        SqlQuery vwOrd = ctx.Select.From<ViewOrders>();
        vwOrd = vwOrd.And("StartDate").IsGreaterThanOrEqualTo("04/22/2010"); // From date
        vwOrd = vwOrd.And("StartDate").IsLessThanOrEqualTo("04/22/2010");    // To date
        List<ViewOrders> cat = vwOrd.ToList<ViewOrders>();

    I am only able to fetch the first four records; I can't fetch the final record whose StartDate is 4/23/2010 1:07:00 PM. I think the problem is the time portion. How can I write SubSonic code that compares only the date part of the date-time column? I don't need the "Between" method in SubSonic, since I may only get a single date parameter (the From date alone). How can I solve this problem? Thanks in advance.

    Read the article

  • Post values and upload an image to a PHP server in Android

    - by lawat
    I am trying to upload an image from an Android phone to a PHP server along with some additional values; the request method is POST. My PHP file looks like this:

        if($_POST['val1']){
            if($_POST['val2']){
                if($_FILE['image']){
                    ......
                }
            }
        }else{
            echo "Value not found";
        }

    What I am doing on the Android side is:

        URL url = new URL("http://www/......../myfile.php");
        HttpURLConnection con = (HttpURLConnection) url.openConnection();
        con.setDoInput(true);
        con.setDoOutput(true);
        con.setUseCaches(false);
        con.setRequestMethod("POST"); // Enable HTTP POST
        con.setRequestProperty("Connection", "Keep-Alive");
        con.setRequestProperty("Content-Type", "multipart/form-data;boundary=" + "****");
        connection.setRequestProperty("uploaded_file", imagefilePath);
        DataOutputStream ostream = new DataOutputStream(con.getOutputStream());
        String res = ("Content-Disposition: form-data; name=\"val1\"" + val1 + "****" +
                      "Content-Disposition: form-data; name=\"val2\"" + val2 + "****"
                      "Content-Disposition: form-data; name=\"image\";filename=\"" + imagefilePath + "\"" + "****");
        outputStream.writeBytes(res);

    My actual problem is that the values are not being posted, so the first if condition is false and the else branch runs, i.e. it prints "Value not found". Please help me.
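
    As a point of comparison only (not the code from the question), a multipart/form-data body has a fairly rigid shape: every part starts with --boundary, its headers end with a blank line, lines are separated by CRLF, and the whole body ends with --boundary--. A hedged sketch along those lines, with the field names taken from the question and the URL and everything else assumed:

        import java.io.DataOutputStream;
        import java.io.FileInputStream;
        import java.net.HttpURLConnection;
        import java.net.URL;

        public final class MultipartUpload {
            public static void upload(String val1, String val2, String imagePath) throws Exception {
                String boundary = "****";
                String crlf = "\r\n";
                HttpURLConnection con = (HttpURLConnection)
                        new URL("http://www.example.com/myfile.php").openConnection();
                con.setDoOutput(true);
                con.setRequestMethod("POST");
                con.setRequestProperty("Content-Type", "multipart/form-data; boundary=" + boundary);
                DataOutputStream out = new DataOutputStream(con.getOutputStream());
                // Plain text fields: boundary, headers, blank line, value.
                out.writeBytes("--" + boundary + crlf
                        + "Content-Disposition: form-data; name=\"val1\"" + crlf + crlf + val1 + crlf);
                out.writeBytes("--" + boundary + crlf
                        + "Content-Disposition: form-data; name=\"val2\"" + crlf + crlf + val2 + crlf);
                // File field: note the filename attribute and the raw bytes after the blank line.
                out.writeBytes("--" + boundary + crlf
                        + "Content-Disposition: form-data; name=\"image\"; filename=\"" + imagePath + "\"" + crlf
                        + "Content-Type: application/octet-stream" + crlf + crlf);
                FileInputStream in = new FileInputStream(imagePath);
                byte[] buf = new byte[8192];
                int n;
                while ((n = in.read(buf)) > 0) {
                    out.write(buf, 0, n);
                }
                in.close();
                out.writeBytes(crlf + "--" + boundary + "--" + crlf);
                out.flush();
                out.close();
                System.out.println("HTTP " + con.getResponseCode()); // reading the response forces the request out
            }
        }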

    Read the article

  • Java Applet 411 Content Length

    - by user1903006
    I am new to Java. I wrote an applet with a GUI that sends results (int w and int p) to a server, and I get a "411 Length Required" error. What am I doing wrong? How do you set a Content-Length? This is the method that communicates with the server:

        public void sendPoints1(int w, int p){
            try {
                String url = "http://somename.com:309/api/Results";
                String charset = "UTF-8";
                String query = String.format("?key=%s&value=%s",
                        URLEncoder.encode(String.valueOf(w), charset),
                        URLEncoder.encode(String.valueOf(p), charset));
                String length = String.valueOf((url + query).getBytes("UTF-8").length);
                HttpURLConnection connection = (HttpURLConnection) new URL(url + query).openConnection();
                connection.setRequestMethod("POST");
                connection.setRequestProperty("Content-Length", length);
                connection.connect();
                System.out.println("Responce Code: " + connection.getResponseCode());
                System.out.println("Responce Message: " + connection.getResponseMessage());
            } catch (Exception e) {
                System.err.println(e.getMessage());
            }
        }
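
    A hedged sketch of one way this kind of 411 is commonly resolved (an assumption about this particular API, since the question does not show the server side): the error usually means the POST carries no body at all, and HttpURLConnection only sends a Content-Length that it computes from a body you actually write, so sending key/value as a form-encoded body instead of a query string gives the server the length it is asking for.

        import java.io.OutputStream;
        import java.net.HttpURLConnection;
        import java.net.URL;
        import java.net.URLEncoder;

        public final class ResultsPoster {
            public static void sendPoints(int w, int p) throws Exception {
                String charset = "UTF-8";
                String body = String.format("key=%s&value=%s",
                        URLEncoder.encode(String.valueOf(w), charset),
                        URLEncoder.encode(String.valueOf(p), charset));
                byte[] payload = body.getBytes(charset);

                HttpURLConnection connection =
                        (HttpURLConnection) new URL("http://somename.com:309/api/Results").openConnection();
                connection.setRequestMethod("POST");
                connection.setDoOutput(true); // we intend to write a request body
                connection.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");
                connection.setFixedLengthStreamingMode(payload.length); // explicit Content-Length
                OutputStream out = connection.getOutputStream();
                out.write(payload);
                out.close();
                System.out.println("Response Code: " + connection.getResponseCode());
                System.out.println("Response Message: " + connection.getResponseMessage());
            }
        }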

    Read the article

  • How to get a checkout-able revision info from subversion?

    - by zhongshu
    I want to check an SVN URL and get the latest revision, then check it out. I don't want to use HEAD, because I will be comparing the latest revision to others, so I use "svn info" to get the "Last Changed Rev" for the URL, like this:

        D:\Project>svn info svn://.../branches/.../path
        Path: ...
        URL: svn://.../branches/.../path
        Repository Root: svn://yt-file-srv/
        Repository UUID: 9ed5ffd7-7585-a14e-96b2-4aab7121bb21
        Revision: 2400
        Node Kind: directory
        Last Changed Author: xxx
        Last Changed Rev: 2396
        Last Changed Date: 2010-03-12 09:31:52 +0800

    However, I found that revision 2396 is not checkout-able, because this path is in a branch copied from the trunk, and 2396 is the revision modified in the trunk. So when I use svn checkout -r 2396, I get a working copy for the path in the trunk, and then I cannot check in to the branch.

        D:\Project>svn checkout svn://.../branches/.../path -r 2396 workcopy
        .....
        .....
        D:\Project>svn info workcopy
        Path: workcopy
        URL: svn://.../trunk/.../path
        Repository Root: svn://yt-file-srv/
        Repository UUID: 9ed5ffd7-7585-a14e-96b2-4aab7121bb21
        Revision: 2396
        Node Kind: directory
        Schedule: normal
        Last Changed Author: xxx
        Last Changed Rev: 2396
        Last Changed Date: 2010-03-12 09:31:52 +0800

    So my question is how to get a checkout-able revision for the branch path; in this example I want to get 2397 (because 2397 is the revision at which the copy occurred). I know "svn log" can give me this information, but the "svn log" output can be very long, and parsing it is more difficult than parsing "svn info". I just want to know the latest checkout-able revision for the path.

    Read the article

  • Simple search form passing the searched string through GET

    - by Brian Roisentul
    Hi, I'd like my search form to return the following URL after submit:

        /anuncios/buscar/the_text_I_searched

    My form is the following:

        <% form_for :announcement, :url => search_path(:txtSearch) do |f| %>
          <div class="searchBox" id="basic">
            <%= text_field_tag :txtSearch,
                  params[:str_search].blank? ? "Busc&aacute; tu curso r&aacute;pido y f&aacute;cil." : params[:str_search],
                  :maxlength => 100,
                  :class => "basicSearch_inputField",
                  :onfocus => "if (this.value=='Busc&aacute; tu curso r&aacute;pido y f&aacute;cil.') this.value=''",
                  :onblur => "if(this.value=='') { this.value='Busc&aacute; tu curso r&aacute;pido y f&aacute;cil.'; return false; }" %>
            <div class="basicSearch_button">
              <input type="submit" value="BUSCAR" class="basicSearch_buttonButton" />
              <br /><a href="#" onclick="javascript:jQuery('#advance').modal({opacity:60});">Busqueda avanzada</a>
            </div>
          </div>
        <% end %>

    My routes line for search_path is this:

        map.search '/anuncios/buscar/:str_search', :controller => 'announcements', :action => 'search'

    This works if I manually type the URL I want into the browser, but if you look at the form's URL you'll find a ":txtSearch" parameter, which is not giving me the actual value of the text field when the form is submitted. And that's what I'd like to get! Could anybody help me with this?

    Read the article

  • Need help fetching an array using prepared statements

    - by eldan221
    I have written the following code to fetch a string, but for some reason it only returns 1. I have double-checked everything and it seems correct. I am not sure why it only returns 1. Any help here would be really appreciated!

        // Class defined here
        function MenuCat($id){
            $query = "SELECT menu_category_description FROM menu_categories WHERE id = ?";
            $stmt = $this->db->prepare($query);
            $stmt->bind_param("i", $id);
            $stmt->execute();
            $stmt->bind_result($menu_category_description);
            $row = $stmt->fetch();
            return $row;
        }

        $display_category = $cat_des->MenuCat($id);
        echo $display_category;

    Read the article

  • Passing along variable inside link works only on first word in multiple-word-variables, why?

    - by Camran
    I have the code below. As you can see, I am passing two variables along with the link. The second variable (category) works whenever it consists of one word, but some categories are two or more words, and then on the receiving PHP page where I fetch the variable, only the first word comes through. Any ideas? For example, I pass along this:

        Rims & Tires

    Only this comes through:

        Rims

    The link is built like this:

        $display_table .= "<a href='../ad.php?ad_id=$row[ad_id]&category=$row[category]' target='_parent'>";

    Here is how I fetch it in the receiving PHP file (which the link points to):

        $cat = $_GET['category'];
        echo $cat; // displays only the first word of multiple-word categories

    Thanks

    Read the article

  • Notify a content resolver when normal sql insert or update

    - by user1400538
    I have created a small content provider (which I only use to query a database table and fetch some values). I have confirmed that the provider works fine. The table values get updated on a regular basis through normal SQL insertions (and not through any content provider). Whenever an insert, update or delete occurs through a normal SQLite operation as mentioned above, I need to notify the content resolvers that were written to communicate with the content provider just for querying the database and fetching some values. Is this possible? If yes, what needs to be done? Any help is appreciated.
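
    A minimal sketch of the usual pattern, under the assumption that the provider's query() registers its result cursors on a content URI with setNotificationUri(): after writing to the table directly with SQLiteDatabase, call notifyChange() on that same URI so observers and managed cursors refresh. The authority, path and table name below are placeholders.

        import android.content.ContentResolver;
        import android.content.ContentValues;
        import android.content.Context;
        import android.database.sqlite.SQLiteDatabase;
        import android.net.Uri;

        public final class DirectWriteHelper {
            // Stand-in for whatever URI the real provider exposes.
            private static final Uri TABLE_URI = Uri.parse("content://com.example.provider/items");

            public static void insertAndNotify(Context context, SQLiteDatabase db, ContentValues values) {
                db.insert("items", null, values);          // plain SQL-level insert, no provider involved
                ContentResolver resolver = context.getContentResolver();
                resolver.notifyChange(TABLE_URI, null);    // tells observers of this URI that the data changed
            }
        }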

    Read the article

  • LIMIT amount of rows fetched by JOIN

    - by user892134
    How do I LIMIT the child rows fetched to only 5? Here is the SQLFiddle: http://sqlfiddle.com/#!2/bd96a/2. Right now it fetches all rows with parentid='4' and parentid='14'. It should only fetch 5 of each parentid; assuming I have hundreds of rows, it should fetch a maximum of 5 for each parentid.

        SELECT child.*
        FROM mytable as parent
        LEFT JOIN mytable as child on child.parentid = parent.id
        WHERE parent.pageid IN (1, 2)
          AND parent.submittype = '1'
        ORDER BY child.id ASC

    How do I solve this?

    Read the article

  • initWithContentsOfURL seems to have issues with "long" URLs

    - by samsam
    Hi there. I'm facing a rather strange issue when trying to load data from an XML web service. The web service allows me to pass separated identifiers within the URL request, so it is possible for the URL to become rather long (240 characters). If I open said URL in Firefox the response arrives as expected; if I execute the following code, xmlData remains empty.

        NSString *baseUrl = [[NSString alloc] initWithString:
            [[[[kSearchDateTimeRequestTV stringByReplacingOccurrencesOfString:@"{LANG}" withString:appLanguageCode]
                stringByReplacingOccurrencesOfString:@"{IDENTIFIERS}" withString:myIdentifiers]
                stringByReplacingOccurrencesOfString:@"{STARTTICKS}" withString:[NSString stringWithFormat:@"%@", [[startTime getTicks] descriptionWithLocale:nil]]]
                stringByReplacingOccurrencesOfString:@"{ENDTICKS}" withString:[NSString stringWithFormat:@"%@", [[endTime getTicks] descriptionWithLocale:nil]]]];
        NSLog(baseUrl); // looks good; if opened in a browser, the return value is OK
        urlRequest = [NSURL URLWithString:baseUrl];
        NSString *xmlData = [NSString stringWithContentsOfURL:urlRequest encoding:NSUTF8StringEncoding error:&err];
        // err is nil, therefore I guess everything must be OK... :(
        NSLog(xmlData); // nothing...

    Is there any sort of URL-length restriction? Has the same problem happened to anyone of you as well? What's a good workaround? Thanks for your help, sam

    Read the article

  • Performing LINQ Self Join

    - by senfo
    I'm not getting the results I want for a query I'm writing in LINQ using the following:

        var config = (from ic in repository.Fetch()
                      join oc in repository.Fetch() on ic.Slot equals oc.Slot
                      where ic.Description == "Input" && oc.Description == "Output"
                      select new Config { InputOid = ic.Oid, OutputOid = oc.Oid }).Distinct();

    The following SQL returns 53 rows (which is correct), but the LINQ above returns 96 rows:

        SELECT DISTINCT ic.Oid AS InputOid, oc.Oid AS OutputOid
        FROM dbo.Config AS ic
        INNER JOIN dbo.Config AS oc ON ic.Slot = oc.Slot
        WHERE ic.Description = 'Input' AND oc.Description = 'Output'

    How would I replicate the above SQL in a LINQ query?

    Update: I don't think it matters, but I'm working with LINQ to Entities 4.0.

    Read the article

  • Can I keep git from pushing the master branch to all remotes by default?

    - by Curtis
    I have a local git repository with two remotes ('origin' is for internal development, and 'other' is for an external contractor to use). The master branch in my local repository tracks the master in 'origin', which is correct. I also have a branch 'external' which tracks the master in 'other'. The problem I have now is that my master branch ALSO wants to push to the master in 'other', which is an issue. Is there any way I can specify that the local master should NOT push to other/master? I've already tried updating my .git/config file to include:

        [branch "master"]
            remote = origin
            merge = refs/heads/master
        [branch "external"]
            remote = other
            merge = refs/heads/master
        [push]
            default = upstream

    But git remote show still reports that my master is pushing to both remotes:

        toko:engine cmlacy$ git remote show origin
        Password:
        * remote origin
          Fetch URL: <REPO LOCATION>
          Push  URL: <REPO LOCATION>
          HEAD branch: master
          Remote branches:
            master       tracked
            refresh-hook tracked
          Local branch configured for 'git pull':
            master merges with remote master
          Local ref configured for 'git push':
            master pushes to master (up to date)

    Those are all correct.

        toko:engine cmlacy$ git remote show other
        Password:
        * remote other
          Fetch URL: <REPO LOCATION>
          Push  URL: <REPO LOCATION>
          HEAD branch: master
          Remote branch:
            master tracked
          Local branch configured for 'git pull':
            external merges with remote master
          Local ref configured for 'git push':
            master pushes to master (local out of date)

    That last section is the problem. 'external' should merge with other/master, but master should NEVER push to other/master. That's never going to work.

    Read the article

  • Android: Gzip/Http supported by default?

    - by OneWorld
    I am using the code shown below to get data from our server, where gzip is turned on. Does my code already support gzip (maybe this is already done by Android and not by my Java program), or do I have to add or change something? How can I check that it's using gzip? In my opinion the download is kind of slow.

        private static InputStream OpenHttpConnection(String urlString) throws IOException {
            InputStream in = null;
            int response = -1;
            URL url = new URL(urlString);
            URLConnection conn = url.openConnection();
            if (!(conn instanceof HttpURLConnection))
                throw new IOException("Not an HTTP connection");
            try {
                HttpURLConnection httpConn = (HttpURLConnection) conn;
                httpConn.setAllowUserInteraction(false);
                httpConn.setInstanceFollowRedirects(true);
                httpConn.setRequestMethod("GET");
                httpConn.connect();
                response = httpConn.getResponseCode();
                if (response == HttpURLConnection.HTTP_OK) {
                    in = httpConn.getInputStream();
                    if (in == null)
                        throw new IOException("No data");
                }
            } catch (Exception ex) {
                throw new IOException("Error connecting");
            }
            return in;
        }
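
    A hedged sketch of how to take the guesswork out of it (not part of the question's code, and whether HttpURLConnection negotiates gzip on its own varies by Android version): request gzip explicitly, then check the Content-Encoding response header and decompress yourself when the server honoured it.

        import java.io.IOException;
        import java.io.InputStream;
        import java.net.HttpURLConnection;
        import java.net.URL;
        import java.util.zip.GZIPInputStream;

        public final class GzipHttp {
            public static InputStream openGzipConnection(String urlString) throws IOException {
                HttpURLConnection conn = (HttpURLConnection) new URL(urlString).openConnection();
                conn.setRequestMethod("GET");
                conn.setRequestProperty("Accept-Encoding", "gzip"); // advertise gzip support
                if (conn.getResponseCode() != HttpURLConnection.HTTP_OK) {
                    throw new IOException("HTTP " + conn.getResponseCode());
                }
                InputStream in = conn.getInputStream();
                // The response header tells us whether the body is actually compressed.
                if ("gzip".equalsIgnoreCase(conn.getContentEncoding())) {
                    in = new GZIPInputStream(in);
                }
                return in;
            }
        }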

    Read the article

  • Pipelining a string in PowerShell

    - by ChvyVele
    I'm trying to make a simple PowerShell function to have a Linux-style ssh command, such as:

        ssh username@url

    I'm using plink to do this, and this is the function I have written:

        function ssh {
            param($usernameAndServer)
            $myArray = $usernameAndServer.Split("@")
            $myArray[0] | C:\plink.exe -ssh $myArray[1]
        }

    If entered correctly by the user, $myArray[0] is the username and $myArray[1] is the URL. Thus, it connects to the URL, and when you're prompted for a username, the username is streamed in using the pipeline. Everything works perfectly, except the pipeline keeps feeding the username ($myArray[0]), and it is entered as the password over and over. Example:

        PS C:\Users\Mike> ssh xxxxx@yyyyy
        login as: xxxxx@yyyyy's password:
        Access denied
        xxxxx@yyyyy's password:
        Access denied
        xxxxx@yyyyy's password:
        Access denied
        xxxxx@yyyyy's password:
        Access denied
        xxxxx@yyyy's password:
        Access denied
        xxxxx@yyyyy's password:
        FATAL ERROR: Server sent disconnect message type 2 (protocol error):
        "Too many authentication failures for xxxxx"

    Here the username has been substituted with xxxxx and the URL has been substituted with yyyyy. Basically, I need to find out how to stop the script from piping in the username ($myArray[0]) after it has been entered once. Any ideas? I've looked all over the internet for a solution and haven't found anything.

    Read the article

  • Getting Top (starting no , Ending no) rows from sqlserver

    - by Krishna Thota
    I'm using a GridView in my ASP.NET page, fetching data from SQL Server and placing it in the GridView. Now my problem is this: I'm using paging and placing 50 rows per page in the GridView. I would like to fetch the top 50 rows from the database at start-up and bind them to the GridView; then, when I click on the next page, I want to go back to the database, fetch the second set of 50 rows and bind those, and this process has to continue until the last row is fetched from the database. Can you tell me how to write a query that achieves this?

    Read the article

  • Passing around objects to network packet handlers ?

    - by xeross
    Hey, I've been writing a networking server in C++ for a while now and have come to the stage of looking for a way to properly and easily handle all packets. I am far enough along that I can figure out what kind of packet it is, but now I need to figure out how to get the needed data to the handler functions. I had the following in mind:

    - Have a map of function pointers, with the opcode as key and the function pointer as value
    - Have all these functions take 2 arguments: packet and ObjectAccessor
    - The ObjectAccessor class contains various functions to fetch items such as users and alike
    - Perhaps pass the user's GUID too, so we can fetch it from the ObjectAccessor

    I'd like to know the various implementations others have come up with, so please comment on this idea and reply with your own implementations. Thanks, Xeross
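
    For illustration only, here is a small sketch of that opcode-to-handler dispatch shape. It is written in Java for brevity (in the question's C++ the direct equivalent would be an std::unordered_map of std::function), and Packet, ObjectAccessor and the handler signature are stand-ins for whatever the server actually defines:

        import java.util.HashMap;
        import java.util.Map;

        interface PacketHandler {
            void handle(Packet packet, ObjectAccessor accessor);
        }

        class Packet {
            int opcode;
            byte[] payload;
        }

        class ObjectAccessor {
            // Would look up live objects (users, sessions, ...) by GUID.
            Object findUserByGuid(long guid) { return null; }
        }

        class Dispatcher {
            private final Map<Integer, PacketHandler> handlers = new HashMap<Integer, PacketHandler>();

            void register(int opcode, PacketHandler handler) {
                handlers.put(opcode, handler);
            }

            void dispatch(Packet packet, ObjectAccessor accessor) {
                PacketHandler handler = handlers.get(packet.opcode);
                if (handler != null) {
                    handler.handle(packet, accessor); // the handler pulls whatever it needs via the accessor
                }
            }
        }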

    Read the article

  • Time out when creating a site collection

    - by Daeko
    I am trying to create a site collection programmatically. It worked for about 6 months, but after the servers were updated (various patches) it doesn't work anymore (we have 3 servers: 1 development, 1 test, 1 production). It is still working in my development environment, which hasn't been updated, but not on the two others. I don't receive any error messages; it just hangs at the code that is supposed to add the site collection (see the code below). I am using Windows Server 2003 R2 and SharePoint 2007 (version 12.0.0.6421). It doesn't give me any errors, it just hangs until Internet Explorer comes back with a "request timed out" response. If I try to debug the code, it just stops there and nothing happens: no error messages or anything.

        public static string CreateSPAccountSite(string siteName)
        {
            string url = "";
            SPSecurity.RunWithElevatedPrivileges(delegate()
            {
                SPWeb web = SPContext.Current.Web;
                using (SPSite siteCollectionOuter = new SPSite(web.Site.ID))
                {
                    SPWebApplication webApp = siteCollectionOuter.WebApplication;
                    SPSiteCollection siteCollection = webApp.Sites;
                    SPSite site = siteCollection.Add("sites/" + siteName, siteName,
                        "Auto generated Site collection.", 1033, "STS#0",
                        siteCollectionOuter.Owner.LoginName,
                        siteCollectionOuter.Owner.Name,
                        siteCollectionOuter.Owner.Email); // Hangs here
                    site.PortalName = "Portal";
                    site.PortalUrl = mainUrl; // https://www.ourdomain.net
                    url = site.Url;
                }
            });
            return url; // Should be "https://www.ourdomain.net/sites/siteName"
        }

    Read the article

  • How to parse an XML string value using jQuery?

    - by Vijay
    Hi all, I am new to jQuery. I am trying to parse an XML string using jQuery. I found the following sample code:

        $(function() {
            $.get('data.xml', function(d) {
                var data = "";
                var startTag = "<table border='1' id='mainTable'><tbody><tr><td style=\"width: 120px\">Name</td><td style=\"width: 120px\">Link</td></tr>";
                var endTag = "</tbody></table>";
                $(d).find('url').each(function() {
                    var $url = $(this);
                    var link = $url.find('link').text();
                    var name = $url.find('name').text();
                    data += '<tr><td>' + name + '</td>';
                    data += '<td>' + link + '</td></tr>';
                })
                $("#content").html(startTag + data + endTag);
            });
        });

    In this case I am able to parse and fetch the values from the XML file, but what I am looking for now is to read the XML from a string instead of from a file on disk. That is, instead of data.xml I want to parse a string which contains well-formed XML. Does anyone have any idea about this? Thanks in advance.

    Read the article

  • CSS Sprite for images which have vertical as well as horizontal repeats

    - by Rachel
    I have four images, one of which has the background-repeat property in the horizontal direction and three of which repeat in the vertical direction. I have different CSS classes which currently use these images as follows:

        .sb_header_dropdown {
            background: url(images/shopping_dropdown_bg.gif) repeat-y top left;
            padding: 8px 3px 8px 15px;
        }
        .shopping_basket_dropdown .sb_body {
            background: url(images/shopping_dropdown_body_bg.png) repeat-y top left;
            margin: 0;
            padding: 5px 9px 5px 8px;
            position: relative;
            z-index: 99999;
        }
        .checkout_cart .co_header_left {
            background: url(images/bg.gif) repeat-x 0 -150px;
            overflow: hidden;
            padding-left: 3px;
        }
        .sb_dropdown_footer {
            background: url(images/shopping_dropdown_footer_bg.png) repeat-y top left;
            clear: both;
            height: 7px;
            font-size: 0;
        }

    So here I am making 4 HTTP requests, and I want to implement a CSS sprite for all 4 images so that I can reduce the number of HTTP requests from 4 to 1. The thing to keep in mind is that all 4 images repeat, either in the x-direction or in the y-direction, so how should the sprite be created, and how can it be used in the CSS to reduce the number of HTTP requests? I hope this question is clear.

    Read the article

  • PHP cURL works fine from localhost but not from the server

    - by Joby Joseph
    I have 2 servers, srv1 and srv2. All client sites are stored on srv2 and all authentication details are stored on srv1. Each time a client site is loaded, the site on srv2 sends a cURL request to srv1 to validate. I always get bool(false) when I print the cURL response using var_dump, but if I request validation from my local WAMP installation (localhost), it returns the response perfectly. As I understand it, srv2 is blocking srv1's IP or something like that. Any help or suggestions will be greatly appreciated; the solution is really urgent, as all my clients are stuck with an "invalid authorization" message.

    Edit: When I now test cURL on srv2 it seems to work fine in general, because I am able to fetch other websites without any trouble, but I am not able to fetch data from srv1. I can view a page from srv1 through a browser URL but not through cURL.

    Read the article
