Search Results

Search found 59118 results on 2365 pages for 'data persistence'.

Page 816/2365

  • Problem in Report Design Layout

    - by Prasanna
    Hi, I have a Jasper report with 4 subreports in the detail band of the master report. If data is available for the first subreport, it starts displaying from page 1 with the subreport's header. When the second subreport starts printing data, it prints only its header on page 1, then the page breaks and on page 2 it prints the header again along with its data. I don't want the header of the second subreport to be printed on page 1; it should start printing on page 2. How can I solve this? How can I force the page break? It's urgent, please help. Thanks in advance, Prasanna

    Read the article

  • Double associative array or indexed + associative array

    - by clover
    I'm undecided about the best-practice approach for what I'm trying to do. I'm trying to enter data into an array where the data will look like this:

        apple     - color: red,    price: 2
        orange    - color: orange, price: 3
        banana    - color: yellow, price: 2
        pineapple - color: yellow, price: 5

    When I get input, let's say "green apple" (notice it's a combination of color + name of fruit), I'm going to check whether the fruit-name part exists in the array and display its data (if it exists). What's the right way to compose those arrays? How would I do an indexed array containing an associative array? (Or would this be better as two nested associative arrays? I'm guessing not.)

    Read the article

  • How to embed images in a single HTML / PHP file?

    - by Tatu Ulmanen
    Hi, I am creating a lightweight, single-file database administration tool and I would like to bundle some small icons with it. What is the best way to embed images in an HTML/PHP file? I know a method using PHP where I would call the same file with a GET parameter that outputs hardcoded binary data with the correct header, but that seems a bit complicated. Can I use something to pass the image directly in a CSS background-image declaration? This would allow me to use the CSS sprite technique. Browser support isn't an issue here, so more exotic methods are welcome also. EDIT: Does someone have a link/example showing how to generate data URLs properly with PHP? I figure

        echo 'data:image/png;base64,' . base64_encode(file_get_contents('image.png'));

    would suffice, but I could be wrong.

    Read the article

  • JSON Date coming through as Today's date?

    - by Liam
    I'm trying to convert a JSON date to a dd/mm/yyyy format, which I'm managing to do semi-successfully. The problem I'm encountering is that the date from the record in the DB is, for example, 2009-06-29, which is returned as the usual JSON /Date(1246230000000)/; however, when I try to turn it into the previously mentioned dd/mm/yyyy format, it comes through as today's date. The code I'm using to try to do this is:

        $('input#EmployeeName').result(function(event, data, formatted) {
            $('#StartDate').html(formatJSONDate(Date(!data ? '' : data.StartDate)));
        });

        function formatJSONDate(jsonDate) {
            var newDate = dateFormat(jsonDate, "dd/mm/yyyy");
            return newDate;
        }

    I'm using JavaScript Date Format to try and run the function. Any help is greatly appreciated.

    Read the article

  • Resize Ext.form.ComboBox to fit its content

    - by ITRushn
    There are quite a few solutions on the Ext forums, but I wasn't able to get any of them to work. It seems I am missing something minor. I need to resize the combobox to fit its content when it's first created. I do not need to worry about resizing it when the content changes. Are there any working examples using ExtJS 3.2? Current code:

        var store = new Ext.data.ArrayStore({
            fields: ['view', 'value', 'defaultselect'],
            data: Ext.configdata.dataCP
        });

        comboCPU = new Ext.form.ComboBox({
            tpl: '<tpl for="."><div class="x-combo-list-item"><b>{view}</b><br /></div></tpl>',
            store: store,
            displayField: 'view',
            width: 600,
            typeAhead: true,
            forceSelection: true,
            mode: 'local',
            triggerAction: 'all',
            editable: false,
            emptyText: 'empty text',
            selectOnFocus: true,
            listeners: {
                select: AdjustPrice,
                change: AdjustPrice,
                beforeselect: function (combo, record, index) {
                    return ('false' == record.data.disabled);
                }
            },
            applyTo: 'confelement'
        });

    I've also tried removing width: 600 and replacing it with minListWidth: 600, but that didn't fix the issue either.

    Read the article

  • apostrophe issue with ajax post to php

    - by Ahmet vardar
    Hi there, I am posting data with jQuery Ajax to PHP, but if the input has an apostrophe (') inside, the data won't be posted. I tried encodeURIComponent but it doesn't work. Any idea on this? Thanks. EDIT: my code:

        var name = $("input#name_add").val();
        name = encodeURIComponent(name);
        $.post("function.php", { name: name }, function(data) {
            //codes
        });

    and on the PHP side (function.php):

        $query = "UPDATE `table` SET name = '" . stripslashes($_POST['name']) . "' WHERE ID = '$id'";
        $result = mysql_query($query);
        if ($result){
            print "ok";
        }

    Read the article

  • How to write a program to do file transfer based on omniORBpy

    - by cofthew7
    I'm now writing a CORBA project to do file transfer between client and server, but I run into trouble when I want to upload a file from the client to the server. The IDL I defined is:

        interface SecretMessage {
            string send_file(in string file_name, in string file_obj);
        };

    And I implemented the uploading function in the client code:

        f = open('SB.docx', 'rb')
        data = ''
        for piece in read_in_chunks(f):
            data += piece
        result = mo.send_file('2.docx', data)

    If the file is a plain txt file, there is no problem. But if the file is anything else, like a jpg, doc, or other binary format, it does not work. It gives me the error:

        omniORB.CORBA.BAD_PARAM: CORBA.BAD_PARAM(omniORB.BAD_PARAM_WrongPythonType, CORBA.COMPLETED_NO)

    Where is the problem?
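
    A likely culprit: a CORBA string cannot carry arbitrary binary data, so passing the raw bytes of a .jpg or .docx for a string parameter makes omniORBpy reject the call with BAD_PARAM. A common workaround is to declare the payload as a sequence of octets instead. The sketch below only illustrates that idea; the reworked IDL and the helper names are assumptions, not the asker's actual stubs.

        # Hypothetical reworked IDL (an assumption, not the original):
        #
        #   typedef sequence<octet> FileData;
        #   interface SecretMessage {
        #       string send_file(in string file_name, in FileData file_obj);
        #   };
        #
        # With a sequence<octet> parameter, the client can pass the file's raw
        # bytes unchanged after regenerating the stubs from the new IDL.

        def read_in_chunks(file_obj, chunk_size=64 * 1024):
            """Yield the file in fixed-size chunks so huge files are not read in one go."""
            while True:
                chunk = file_obj.read(chunk_size)
                if not chunk:
                    break
                yield chunk

        def upload(mo, local_path, remote_name):
            with open(local_path, 'rb') as f:
                data = b''.join(read_in_chunks(f))  # raw bytes, valid for sequence<octet>
            return mo.send_file(remote_name, data)

    For very large files it would also be worth extending the interface so the chunks are sent one call at a time instead of being accumulated in memory first.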

    Read the article

  • I need free index/fund/stock end of day quotes in CSV

    - by Janne Mikkola
    Hello, I need a (free or cheap) source for end-of-day stock/mutual fund/index values. Major world indexes and European stocks are of primary interest. I keep seeing that Yahoo/Google/MS offer this data, yet I can't find a HOWTO doc (or similar) on getting it. Reuters is an option, but at roughly $300/month it is out of my range. A sample of what I am looking for:

        WMX.IDX,20100326,54.49,54.6,54.17,54.41,0
        XAH.IDX,20100326,52.39,52.77,52.33,52.54,0
        XAL.IDX,20100326,37.34,38.4,37.34,37.59,0
        XAO.IDX,20100326,4896.2998,4905.2002,4848.2998,4905.2002,0

    I wish to get this data into a text file in an automated manner. My platform is Linux (I also have a PC with Windows, and an emulator for Windows in Linux, so Windows is an option). http://www.eoddata.com/ is the best site I have found so far. It is quite good, yet I would like more coverage of European markets. Please advise! Sincerely, Janne

    Read the article

  • Alright to truncate database tables when also using Hibernate?

    - by Marcus
    Is it OK to truncate tables while at the same time using Hibernate to insert data? We parse a big XML file with many relationships into Hibernate POJOs and persist them to the DB. We are now planning to purge existing data at certain points in time by truncating the tables. Is this OK? It seems to work fine. We don't use Hibernate's second-level cache. One thing I did notice, which is fine, is that when inserting, we generate primary keys using Hibernate's @GeneratedValue, where Hibernate just uses a key value one greater than the highest value in the table; even though we are truncating the tables, Hibernate remembers the prior value and uses the prior value + 1 as opposed to starting over at 1. This is fine, just unexpected. Note that the reason we truncate, as opposed to calling delete() on the Hibernate POJOs, is speed. We have gazillions of rows of data, and truncate is just so much faster.

    Read the article

  • How to fake Azure Table Storage in .NET for Unit Testing?

    - by Erick T
    I am working on a system that uses Azure Table Storage. In other systems (e.g., SQL, file-based, etc.), I can write a fake that allows me to test my data persistence logic. However, I can't see an easy way to create a fake for the Azure Table Service. I could create a new IIS project that behaves the same way, but that isn't a good way to write a unit test; it is more of an integration test. Any thoughts on how to unit test data access code that uses the Azure Table Storage client? Thanks, Erick

    Read the article

  • IMB_ibImageFromMemory: unknown fileformat?

    - by Antoni4040
    Here's my add-on:

        import bpy
        import os
        import sys
        import subprocess
        import threading

        class ExportToGIMP(bpy.types.Operator):
            bl_idname = "uv.exporttogimp"
            bl_label = "Export to GIMP"

            def execute(self, context):
                self.filepath = os.path.join(os.path.dirname(bpy.data.filepath), "Layout")
                bpy.ops.uv.export_layout(filepath=self.filepath, check_existing=True,
                    export_all=False, modified=False, mode='PNG', size=(1024, 1024),
                    opacity=0.25, tessellated=False)
                self.files = os.path.dirname(bpy.data.filepath)
                cmd = " (python-fu-bgsync RUN-NONINTERACTIVE)"
                subprocess.Popen(['gimp', '-b', cmd])
                self.update()
                return {'FINISHED'}

            def update(self):
                self.thread = threading.Timer(3.0, self.update).start()
                self.filepath2 = "/home/antoni4040/????afa/Layout1.png"
                bpy.ops.image.open(filepath=self.filepath2, filter_blender=False,
                    filter_image=True, filter_movie=False, filter_python=False,
                    filter_font=False, filter_sound=False, filter_text=False,
                    filter_btx=False, filter_collada=False, filter_folder=True,
                    filemode=9, relative_path=False)
                tex = bpy.data.textures.new(name=self.filepath2, type="IMAGE")

        def exporttogimp_menu(self, context):
            self.layout.operator(ExportToGIMP.bl_idname, text="Export To GIMP")

        bpy.utils.register_class(ExportToGIMP)
        bpy.types.IMAGE_MT_uvs.append(exporttogimp_menu)

    But I can't load the image, because I get this:

        Reached EOF while decoding PNG
        IMB_ibImageFromMemory: unknown fileformat (/home/antoni4040/????afa/Layout1.png)

    What is that?
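
    The "Reached EOF while decoding PNG" message suggests Blender is opening Layout1.png while GIMP is still writing it, so the file exists but is truncated. A minimal sketch of a guard that waits until the file has stopped growing before calling bpy.ops.image.open; the helper name and the polling parameters are illustrative, not tuned values:

        import os
        import time

        def wait_until_written(path, stable_checks=3, delay=0.5, timeout=30.0):
            """Return True once the file exists and its size stays constant for a
            few consecutive checks, i.e. the writer has most likely finished."""
            deadline = time.time() + timeout
            last_size, stable = -1, 0
            while time.time() < deadline:
                if os.path.exists(path):
                    size = os.path.getsize(path)
                    if size > 0 and size == last_size:
                        stable += 1
                        if stable >= stable_checks:
                            return True
                    else:
                        stable = 0
                    last_size = size
                time.sleep(delay)
            return False

    Calling this before the bpy.ops.image.open line, and only opening the image if it returns True, would rule the race condition in or out.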

    Read the article

  • does log4net AdoNetAppender support sql server 2008?

    - by schrodinger's code
    My config file is below. Very strange: I have spent a day trying to find out where I am wrong, but it is still not working. It still does not log anything to the database, but I can output the entries using the RollingFileAppender. Also, the stored procedure WriteLog works well (I have tested it using SQL Server Management Studio). I have tried changing the connectionType, but that did not help. Unfortunately I don't have SQL Server 2000/2005 to test against; my log4net version should be the latest one, log4net 1.2.10. Any help is appreciated.

        <?xml version="1.0" encoding="utf-8"?>
        <configuration>
          <configSections>
            <section name="log4net" type="log4net.Config.Log4NetConfigurationSectionHandler, log4net" />
          </configSections>
          <log4net>
            <appender name="AdoNetAppender_SqlServer" type="log4net.Appender.AdoNetAppender">
              <!--<threshold value="OFF" />-->
              <bufferSize value="1" />
              <connectionType value="System.Data.SqlClient.SqlConnection, System.Data, System.Data, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" />
              <!--<connectionType value="System.Data.SqlClient.SqlConnection, System.Data, Version=1.0.3300.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" />-->
              <connectionString value="Data Source=.\MSSQLSERVER2008,2222;Initial Catalog=UnleashedSaaS;User ID=sa;Password=dogblack;" />
              <commandType value="StoredProcedure" />
              <commandText value="WriteLog" />
              <parameter>
                <parameterName value="@log_date" />
                <dbType value="DateTime" />
                <layout type="log4net.Layout.PatternLayout" value="%date{yyyy'-'MM'-'dd HH':'mm':'ss'.'fff}" />
              </parameter>
              <parameter>
                <parameterName value="@thread" />
                <dbType value="String" />
                <size value="255" />
                <layout type="log4net.Layout.PatternLayout" value="%thread" />
              </parameter>
              <parameter>
                <parameterName value="@log_level" />
                <dbType value="String" />
                <size value="50" />
                <layout type="log4net.Layout.PatternLayout" value="%level" />
              </parameter>
              <parameter>
                <parameterName value="@logger" />
                <dbType value="String" />
                <size value="255" />
                <layout type="log4net.Layout.PatternLayout" value="%logger" />
              </parameter>
              <parameter>
                <parameterName value="@message" />
                <dbType value="String" />
                <size value="4000" />
                <layout type="log4net.Layout.PatternLayout" value="%message" />
              </parameter>
              <parameter>
                <parameterName value="@exception" />
                <dbType value="String" />
                <size value="4000" />
                <layout type="log4net.Layout.ExceptionLayout" />
              </parameter>
            </appender>
            <appender name="RollingLogFileAppender" type="log4net.Appender.RollingFileAppender">
              <!--<threshold value="OFF" />-->
              <file value="LogData\\" />
              <appendToFile value="true" />
              <datePattern value="ul_yyyy-MM-dd.LOG" />
              <maxSizeRollBackups value="10" />
              <rollingStyle value="Date" />
              <maximumFileSize value="2MB" />
              <staticLogFileName value="false" />
              <layout type="log4net.Layout.PatternLayout">
                <param name="ConversionPattern" value="%d{yyyy-MM-dd HH:mm:ss} %p %u %c %l %m %n%n%n" />
              </layout>
            </appender>
            <root>
              <level value="ALL"/>
              <appender-ref ref="AdoNetAppender_SqlServer" />
              <appender-ref ref="RollingLogFileAppender" />
            </root>
          </log4net>
        </configuration>

    Read the article

  • how does NTFS actually work with a B-tree?

    - by bakra
    To improve performance, NTFS directories use a special data management structure called a B-tree. The "B-tree" concept here refers to a "tree of storage units" that holds the contents of an individual directory. What I don't understand is: where on the disk is this tree stored? Surely it is not created every time we reboot; that would take a lot of time. And since it is a tree (a dynamic data structure), unlike an array it will grow, so space needs to be allocated every time it grows. So how is this "dynamic metadata" stored?

    Read the article

  • problem with asf writer

    - by hatham
    I'm trying to encode raw data (both video frames and audio samples) into an .asf file, using the ASF writer filter in DirectShow. My filter graph structure: raw_send_filter -> ASF writer filter. raw_send_filter implements CBaseFilter and CBaseOutputPin. It plays the role of a source filter which gets raw data, then delivers it to the ASF writer filter. The process follows these steps:

        1. Get a delivery buffer (returned into "sample"), using CBaseOutputPin::GetDeliveryBuffer
        2. sample->GetPointer(&buffer);
        3. Set the time stamp (with frame rate = 30 fps)
        4. Deliver the sample

    The problem is that after encoding some raw data, I cannot deliver any more. I can encode an .avi file this way, using the AVI Mux filter. Can you tell me why I cannot deliver samples after encoding some? Thanks.

    Read the article

  • Doing Ajax with jQuery

    - by user272899
    I have been loading content with Ajax and it all works fine. Here is my code:

        $(document).ready(function() {
            //Load a-z.php
            //Timestamp resolves IE caching issue
            var tsTimeStamp = new Date().getTime();
            $.post('../../includes/categories/a-z.php',
                {action: "post", time: tsTimeStamp},
                function(data){
                    $('#moviescontainer').html(data).slideDown('slow');
                });
            return true;
        });

    The content inside a-z.php requires JavaScript, and when I load a-z.php onto my page that JavaScript doesn't work. I'm guessing that I need to link the relevant files to a-z.php and then load it via Ajax. Doesn't this kind of defeat the object of Ajax? That means I will be loading the JS files on the main page and then loading them again when I Ajax in a-z.php. I hope I made some sense.

    Read the article

  • Qt drag & drop button; drop not detected

    - by Thomas Verbeke
    I'm creating a 2D game in Qt and I'm trying to implement drag & drop in my program. For some reason the drop is not registered: qDebug should print a message on dropping, but this doesn't happen.

        #include "dialog.h"
        #include "ui_dialog.h"
        #include "world.h"
        #include <vector>

        Dialog::Dialog(QWidget *parent) :
            QDialog(parent),
            ui(new Ui::Dialog)
        {
            ui->setupUi(this);
            scene = new QGraphicsScene(this);
            ui->graphicsView->setScene(scene);
            MySquare *item;
            QGraphicsRectItem *enemyItem;
            World *myWorld = new World();
            std::vector<Tile*> tiles = myWorld->createWorld(":/texture.jpg");
            int count = 0;
            foreach (Tile *tile, tiles){
                count++;
                item = new MySquare(tile->getXPos()*4,tile->getYPos()*4,4,4);
                item->setBrush(QColor(tile->getValue()*255,tile->getValue()*255,tile->getValue()*255));
                item->setAcceptDrops(true);
                scene->addItem(item);
            }
            player = new MySquare(10,20,10,10);
            player->setAcceptDrops(true);
            scene->addItem(player);

            //drag & drop part
            QPushButton *pushButton = new QPushButton("Click Me",this);
            connect(pushButton,SIGNAL(pressed()),this,SLOT(makeDrag()));
            setAcceptDrops(true);
        }

        void Dialog::makeDrag()
        {
            QDrag *dr = new QDrag(this);
            // The data to be transferred by the drag and drop operation is contained in a QMimeData object
            QMimeData *data = new QMimeData;
            data->setText("This is a test");
            // Assign ownership of the QMimeData object to the QDrag object.
            dr->setMimeData(data);
            // Start the drag and drop operation
            dr->start();
        }

    mysquare.cpp:

        #include "mysquare.h"

        MySquare::MySquare(int _x,int _y, int _w, int _h)
        {
            isPlayer=false;
            Pressed=false;
            setFlag(ItemIsMovable);
            setFlag(ItemIsFocusable);
            setAcceptDrops(true);
            color=Qt::red;
            color_pressed = Qt::green;
            x = _x;
            y = _y;
            w = _w;
            h = _h;
        }

        QRectF MySquare::boundingRect() const
        {
            return QRectF(x,y,w,h);
        }

        void MySquare::paint(QPainter *painter, const QStyleOptionGraphicsItem *option, QWidget *widget)
        {
            QRectF rec = boundingRect();
            QBrush brush(color);
            if (Pressed){
                brush.setColor(color);
            } else {
                brush.setColor(color_pressed);
            }
            painter->fillRect(rec,brush);
            painter->drawRect(rec);
        }

        void MySquare::mousePressEvent(QGraphicsSceneMouseEvent *event)
        {
            Pressed=true;
            update();
            QGraphicsItem::mousePressEvent(event);
            qDebug() << "mouse Pressed";
        }

        void MySquare::mouseReleaseEvent(QGraphicsSceneMouseEvent *event)
        {
            Pressed=false;
            update();
            QGraphicsItem::mousePressEvent(event);
            qDebug() << "mouse Released";
        }

        void MySquare::keyPressEvent(QKeyEvent *event){
            int x = pos().x();
            int y = pos().y();
            //key handling
            QGraphicsItem::keyPressEvent(event);
        }

        void MySquare::dropEvent(QDropEvent *event)
        {
            qDebug("dropEvent - square");
            // Unpack dropped data and handle it the way you want
            qDebug("Contents: %s", event->mimeData()->text().toLatin1().data());
        }

        void MySquare::dragMoveEvent(QDragMoveEvent *event){
            qDebug("dragMoveEvent - square ");
            event->accept();
        }

        void MySquare::dragEnterEvent(QDragEnterEvent *event){
            event->setAccepted(true);
            qDebug("dragEnterEvent - square");
            event->acceptProposedAction();
        }

        void MySquare::setBrush(QColor _color){
            color = _color;
            color_pressed = _color;
            update(); //repaint
        }

    Edit: there is no problem with qDebug(); I'm just using it to check whether I'm inside the drag events, which I'm not.

    Read the article

  • Database Replication OOD Pattern

    - by MrOnigiri
    Greetings fellow overflowers. After reading on MSDN about correct strategies for performing database replication, and understanding their suggestion of Master-Subordinate Incremental Replication, I was left wondering what OOD design pattern I should use for this. The main elements of this strategy are the Acquirer, the Manipulator and the Writer. The first fetches data from the database and passes it on to the second, which might perform simple transformations on the data before handing it to the final element, the Writer, which writes the desired data to the destination database. I thought about using the Chain of Responsibility pattern, but the Acquirer, Manipulator and Writer don't share a common role among them, so it makes no sense. Should these elements be written as separate classes, or as methods inside my service? Of course I'll be creating a DB helper class as well, but that doesn't constitute a problem. Wondering what your opinions on this are! Thanks for your replies
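
    As described, the flow reads more like Pipes and Filters than Chain of Responsibility: three distinct roles connected in a fixed order, each independently testable. A minimal sketch of that shape, assuming plain DB-API connections with question-mark placeholders; every class, table and column name here is illustrative, not from MSDN or any real replication framework:

        class Acquirer:
            """Fetches the rows that changed since the last replication run."""
            def __init__(self, source_conn):
                self.source_conn = source_conn

            def acquire(self, last_sync_id):
                cur = self.source_conn.cursor()
                cur.execute("SELECT id, payload FROM source_table WHERE id > ?",
                            (last_sync_id,))
                return cur.fetchall()

        class Manipulator:
            """Applies simple per-row transformations before writing."""
            def transform(self, rows):
                return [(row_id, payload.strip()) for row_id, payload in rows]

        class Writer:
            """Writes the transformed rows to the destination database."""
            def __init__(self, dest_conn):
                self.dest_conn = dest_conn

            def write(self, rows):
                cur = self.dest_conn.cursor()
                cur.executemany("INSERT OR REPLACE INTO dest_table (id, payload) VALUES (?, ?)",
                                rows)
                self.dest_conn.commit()

        def replicate(acquirer, manipulator, writer, last_sync_id):
            """One incremental pass: acquire, transform, write; returns the new watermark."""
            rows = acquirer.acquire(last_sync_id)
            writer.write(manipulator.transform(rows))
            return max((row_id for row_id, _ in rows), default=last_sync_id)

    Keeping the three elements as separate classes, rather than methods on one service, makes it easy to swap the Manipulator out or to unit test the Writer against an in-memory database.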

    Read the article

  • Where should I exclude and select information BL or DL?

    - by MRFerocius
    Hi guys, I have another conceptual question. Suppose I have a data layer and a business layer. In my database I have, for example, Customers, and each customer has an assigned Vendor:

        Customers(customerID, customerName, customerAddress, vendorID)
        Vendors(vendorID, vendorName, vendorAddress)

    Now suppose my vendor logs into my web application and wants to see all his customers:

        a) Should I filter his customers in the query inside my data layer method?
        b) Should the data layer return all the customers and have the business layer filter out that vendor's ones?

    Is (b) even a good approach? It is the one I want to use... Is it correct? Thanks in advance!!!

    Read the article

  • break dataframe into subsets by factor values, send to function that returns glm class, how to recom

    - by Alex Holcombe
    Thanks to the ddply function from Hadley's plyr package, we can take a dataframe, break it down into sub-dataframes by factors, send each to a function, and then combine the function results for each sub-dataframe into a new dataframe. But what if the function returns an object of a class like glm, or in my case a c("glm", "lm")? Then these can't be combined into a dataframe, can they? Instead I get this error:

        Error in as.data.frame.default(x[[i]], optional = TRUE, stringsAsFactors = stringsAsFactors) :
          cannot coerce class 'c("glm", "lm")' into a data.frame

    Is there some more flexible data structure that will accommodate all the complex glm-class results of my function calls, preserving the information regarding the dataframe subsets? Or should this be done in an entirely different way?

    Read the article

  • Can an HTTP server detect that a client has cancelled their request?

    - by Nick Retallack
    My web app must process and serve a lot of data to display certain pages. Sometimes the user closes or refreshes a page while the server is still busy processing it. This means the server will continue to process data for several minutes, only to send it to a client who is no longer listening. Is it possible to detect that the connection has been broken and react to it? In this particular project we're using Django with nginx or Apache. I assumed this is possible because the Django development server appears to react to cancelled requests by printing Broken Pipe exceptions. I'd love to have it raise an exception that my application code could catch. Alternatively, I could register an unload event handler on the page in question, have it make a synchronous XHR requesting that the previous request from this user be cancelled, and do some kind of inter-process communication to make that happen. Perhaps if the slower data processing were handed off to another process that I could more easily identify and kill, without killing the responding process...
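
    With a WSGI deployment, the application usually only finds out about a dropped client when something tries to write to the closed socket. One hedged option is to stream the response from a generator: servers are expected to call close() on the response iterable when they stop iterating it, which shows up inside a running generator as GeneratorExit and can double as a cancellation hook. A minimal plain-WSGI sketch, not Django-specific; the expensive_computation and cancel_expensive_computation names are hypothetical, and whether and when this fires depends on the server and on nginx/Apache buffering:

        def application(environ, start_response):
            start_response('200 OK', [('Content-Type', 'text/plain')])

            def body():
                try:
                    for chunk in expensive_computation():  # hypothetical generator of partial results
                        yield chunk                        # the server stops iterating if the client is gone
                except GeneratorExit:
                    cancel_expensive_computation()         # hypothetical cleanup hook
                    raise

            return body()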

    Read the article

  • Using LINQ to search a byte array for all subarrays that start/stop with certain byte

    - by Joel B
    I'm dealing with a COM port application, and we have a defined variable-length packet structure that I'm using to talk to a micro-controller. The packet has delimiters for the start and stop bytes. The trouble is that sometimes the read buffer can contain extraneous characters. It seems like I'll always get the whole packet, just some extra chatter before/after the actual data. So I have a buffer that I append data to whenever new data is received from the COM port. What is the best way to search this buffer for all possible occurrences of my packet? For example, say my packet delimiter is 0xFF and I have an array such as:

        { 0x00, 0xFF, 0x02, 0xDA, 0xFF, 0x55, 0xFF, 0x04 }

    How can I create a function/LINQ statement that returns all subarrays that start and end with the delimiter (almost like a sliding correlator with wildcards)? The sample would return the following 3 arrays:

        {0xFF, 0x02, 0xDA, 0xFF}, {0xFF, 0x55, 0xFF}, and {0xFF, 0x02, 0xDA, 0xFF, 0x55, 0xFF}

    Read the article

  • Huge file in Clojure and Java heap space error

    - by trzewiczek
    I posted before about a huge XML file - it's a 287GB XML Wikipedia dump that I want to put into a CSV file (revision authors and timestamps). I managed to do that up to a point. Before, I got a StackOverflowError, but now, after solving that first problem, I get a java.lang.OutOfMemoryError: Java heap space error. My code (partly taken from Justin Kramer's answer) looks like this:

        (defn process-pages [page]
          (let [title (article-title page)
                revisions (filter #(= :revision (:tag %)) (:content page))]
            (for [revision revisions]
              (let [user (revision-user revision)
                    time (revision-timestamp revision)]
                (spit "files/data.csv"
                      (str "\"" time "\";\"" user "\";\"" title "\"\n")
                      :append true)))))

        (defn open-file [file-name]
          (let [rdr (BufferedReader. (FileReader. file-name))]
            (->> (:content (data.xml/parse rdr :coalescing false))
                 (filter #(= :page (:tag %)))
                 (map process-pages))))

    I don't show the article-title, revision-user and revision-timestamp functions, because they simply take data from a specific place in the page or revision hash. Could anyone help me with this? I'm really new to Clojure and don't understand the problem.

    Read the article

  • How to import a text file into a table with an auto-increment primary key

    - by webworm
    I have some bulk data in a text file that I need to import into a MySQL table. The table consists of two fields:

        ID (integer with auto-increment)
        Name (varchar)

    The text file is a large collection of names, with one name per line, for example:

        John Doe
        Alex Smith
        Bob Denver

    I know how to import a text file via phpMyAdmin; however, as far as I understand, I need to import data that has the same number of fields as the target table. Is there a way to import the data from my text file into one field and have the ID field auto-increment automatically? Thank you in advance for any help.
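
    One way around the column-count mismatch is to load only the Name column and let MySQL assign the IDs; LOAD DATA INFILE accepts an explicit column list such as (Name) for exactly this. A minimal scripted alternative using mysql-connector-python is sketched below, with placeholder connection details and a hypothetical people table:

        import mysql.connector

        def import_names(path):
            conn = mysql.connector.connect(user='user', password='secret',
                                           host='localhost', database='mydb')
            try:
                with open(path) as f:
                    names = [(line.strip(),) for line in f if line.strip()]
                cur = conn.cursor()
                # Only the Name column is supplied; ID auto-increments on its own.
                cur.executemany("INSERT INTO people (Name) VALUES (%s)", names)
                conn.commit()
            finally:
                conn.close()

        import_names('names.txt')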

    Read the article

  • How to update the contents of a FigureCanvasTkAgg

    - by Copo
    I'm plotting some data in a Tkinter FigureCanvasTkAgg using matplotlib. I need to clear the figure where I plot data and draw new data when a button is pressed. Here is the plotting part of the code (there's an App class defined before this):

        self.fig = figure()
        self.ax = self.fig.add_subplot(111)
        self.ax.set_ylim(min(y), max(y))
        self.line, = self.ax.semilogx(x, y, '.-')  # tuple of a single element
        self.canvas = FigureCanvasTkAgg(self.fig, master=master)
        self.ax.semilogx(x, y, 'o-')
        self.canvas.show()
        self.canvas.get_tk_widget().pack(side='top', fill='both', expand=1)
        self.frame.pack()

    How do I update the contents of such a canvas? Regards, Jacopo
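
    A minimal sketch of a redraw callback for the layout above: clear the existing axes, plot the new data, and ask the canvas to repaint. The method name and the new_x/new_y arguments are illustrative:

        def redraw(self, new_x, new_y):
            self.ax.clear()                    # drop the old lines, labels and limits
            self.ax.set_ylim(min(new_y), max(new_y))
            self.line, = self.ax.semilogx(new_x, new_y, '.-')
            self.canvas.draw()                 # repaint the Tk widget

        # wired to a Tkinter button, for example:
        # Button(master, text="Redraw", command=lambda: app.redraw(new_x, new_y))

    Calling self.fig.clf() instead of self.ax.clear() also works, but then the subplot has to be re-added with add_subplot(111) before plotting again.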

    Read the article

  • Packing a file into an ELF executable

    - by Pierre Bourdon
    Hello, I'm currently looking for a way to add data to an already compiled ELF executable, i.e. embedding a file into the executable without recompiling it. I could easily do that by using cat myexe mydata > myexe_with_mydata, but I couldn't access the data from the executable because I don't know the size of the original executable. Does anyone have an idea of how I could implement this? I thought of adding a section to the executable, or using a special marker (0xBADBEEFC0FFEE for example) to detect the beginning of the data in the executable, but I do not know if there is a more beautiful way to do it. Thanks in advance.
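
    The marker idea works, but it is more robust to record the payload size explicitly: append the data, then append a small fixed-size footer holding the payload length and a magic tag, so the program can find the payload by reading its own tail at runtime (the in-executable reader would do the same seek-from-end and read). A minimal sketch of that idea; the magic value and the names are arbitrary:

        import struct

        MAGIC = b'MYPAYLOD'                    # 8-byte tag, purely illustrative
        FOOTER = struct.Struct('<Q8s')         # payload length + magic, 16 bytes total

        def pack(exe_path, data_path, out_path):
            with open(exe_path, 'rb') as f, open(data_path, 'rb') as d:
                exe, data = f.read(), d.read()
            with open(out_path, 'wb') as out:
                out.write(exe)
                out.write(data)
                out.write(FOOTER.pack(len(data), MAGIC))

        def unpack(path):
            with open(path, 'rb') as f:
                f.seek(-FOOTER.size, 2)        # footer sits at the very end of the file
                length, magic = FOOTER.unpack(f.read(FOOTER.size))
                if magic != MAGIC:
                    raise ValueError('no embedded payload found')
                f.seek(-(FOOTER.size + length), 2)
                return f.read(length)

    The section route mentioned in the question is also viable without recompiling: objcopy --add-section can attach the file as a named section of the existing ELF.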

    Read the article
