Search Results

Search found 22951 results on 919 pages for 'debug build'.


  • Is there a way to automate testing for a GNS3 network topology?

    - by Chedy2149
    I'm working on an MPLS backbone topology using GNS3. This topology includes the configuration of several networking technologies such as IPv6, BGP, OSPF, IPv6-to-IPv4 dynamic tunneling, MPLS-based VPNs, and MPLS-based traffic engineering... I'm a beginner with basic networking knowledge (IPv4 addressing, basic routing and switching), so I can't easily test whether my topology works or not; moreover, while fiddling with the configuration I've noticed that getting the MPLS config right but missing the BGP config can mess up the whole topology. Given that I use Cisco 7200 series routers, is there any way to build a test harness for my topology in order to guarantee correctness with a single push-button automated process?
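
    Not a full answer, but one low-tech way to get a push-button check is to script the router consoles that GNS3/Dynamips exposes on local TCP ports. The sketch below is an assumption-heavy example (the console ports 2001/2002 and the 10.255.0.x loopback addresses are made up; adjust them to your topology): it sends a ping from each router's console and greps the output for Cisco's "!!!" success marks.

      #!/bin/bash
      # Rough reachability check for a GNS3 lab; a sketch, not a full harness.
      # Assumes each router console is reachable on 127.0.0.1:200x.
      fail=0
      for t in "2001:10.255.0.2" "2002:10.255.0.3"; do
        port=${t%%:*}; target=${t##*:}
        # Wake the console, send a ping, give it time to finish, then read the output.
        # "timeout" guards against netcat variants that do not exit on stdin EOF.
        out=$( { printf '\r\n'; sleep 1; printf 'ping %s\r\n' "$target"; sleep 15; } \
               | timeout 25 nc 127.0.0.1 "$port" )
        if echo "$out" | grep -q '!!!'; then
          echo "PASS: router on console $port reaches $target"
        else
          echo "FAIL: router on console $port cannot reach $target"
          fail=1
        fi
      done
      exit $fail

    Run it after every configuration change; a non-zero exit code means at least one reachability test failed, which makes it easy to wire into a larger script.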

    Read the article

  • share distribution question

    - by facebook-100000781341887
    Hi, I just developed a Facebook game (mafia-like), but the graphics I made are not good, because they are referenced from existing photos, traced in AI, and colored. Therefore, I invited my friend to join me; he is a graphic designer and owns a company with his friend (I know both of them). For the share, I expect at least 70% for me and at most 30% for them (both of them want to join). They gave me a counter offer of 60% for me and 40% for them. Of course, I feel their counter offer is unacceptable, because they would only build the images part time, while all the other work, like coding, web hosting, etc., is what I do full time. Their argument for being worth 40% is that they will make good graphics and can provide an advertising channel (in a local magazine), etc... Actually, I don't think the game needs advertisement in a local magazine, because the game is not targeted at a local audience... Please give me some comments on this issue (is the share fair? what is the importance of the game's artwork, and is it worth more than 30%?), or can anyone share their experience on this? Thanks in advance.

    Read the article

  • FTP server (vsftpd) with webgui

    - by manutenfruits
    I want to build a file server to let users upload and download mostly multimedia, but also common files. Right now I have an Arch installation with vsftpd, and I'm about to install miniDLNA for multimedia sharing. The only problem is that FTP doesn't seem to fit my needs, because users almost always need a client such as FileZilla for the server to be friendly. I have been looking for a web frontend for vsftpd, but apart from management interfaces there's nothing. I need a frontend accessible from a browser through which users can navigate through the folders in an easier and more elegant way than the plain FTP listing that browsers render by default. It should let users upload files and, as an awesome extra, let them play the multimedia directly in the browser. For this, I am willing to dump FTP if needed; I've heard about HTTP file servers but don't know too much about them. I could code everything myself, but there's gotta be something out there already.

    Read the article

  • how to customize debian installation cd for your needs

    - by Frank
    I have a Debian CD which I want to customize for my own needs. I have extracted the CD and started to change some parts of it, e.g. the splash screen (splash.png), the installer title (through isolinux.cfg), etc. These are the things that I want to do: change the splash logo at the start of installation to my own (which is done); change the GRUB boot parameters to carry my company name; change the set of packages so that my own selection, and only those packages, gets installed; do some post-installation steps; customize the startup and login screens to show my company name. After I am done with this customization, I need to rebuild the live installer CD so that I can install it on my own on any other system.
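
    For reference, a sketch of the re-mastering step once the extracted tree (here ./cd-extracted) has been edited; package selection and post-install steps usually go through a preseed file referenced from the boot line, e.g. by appending "preseed/file=/cdrom/preseed.cfg" in isolinux.cfg (or the txt.cfg it includes). The paths and names below are assumptions to adapt:

      # Regenerate the checksum file the installer verifies against:
      cd cd-extracted
      find . -type f ! -name md5sum.txt -print0 | xargs -0 md5sum > md5sum.txt
      cd ..

      # Rebuild a bootable ISO from the edited tree:
      genisoimage -o custom.iso -r -J -no-emul-boot -boot-load-size 4 \
        -boot-info-table -b isolinux/isolinux.bin -c isolinux/boot.cat ./cd-extracted

    Test the resulting custom.iso in a VM before burning it; branding changes (splash, titles, login screen) survive this step unchanged since they live in the extracted tree.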

    Read the article

  • Tar dereference only 1 level

    - by Bart van Heukelom
    I use the following pseudo-script to create a TAR of my installed software:

      mkdir tmp
      ln -s /path/to/app1/bin tmp/app1
      ln -s /and/path/going/to/the-app-2 tmp/app2
      tar -c --dereference -f apps.tar tmp

    I need the --dereference option here to follow the links I just made in tmp. The reason I make the links in the first place is to store the directories with a different name in the archive than they have on the filesystem. Until now it has worked fine. However, I now have the situation that /path/to/app1 also contains links, and those I don't want to follow. Is this possible with some changes to the tar command? Or do I need to completely switch around the way I build the archive?
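
    One possible alternative (a sketch, assuming GNU tar; the paths are taken from the question): skip the symlink staging directory entirely and rename the directories with --transform, so --dereference is no longer needed and any symlinks inside app1 are stored as symlinks.

      # GNU tar strips the leading "/" from member names, so the patterns
      # below are anchored without it. Multiple --transform options are
      # applied in order.
      tar -cf apps.tar \
        --transform='s|^path/to/app1/bin|app1|' \
        --transform='s|^and/path/going/to/the-app-2|app2|' \
        /path/to/app1/bin /and/path/going/to/the-app-2

    Check the result with tar -tf apps.tar to confirm the members are stored under app1/ and app2/ as intended.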

    Read the article

  • Installing packages into local directory?

    - by Gili
    I'd like to install software packages, similar to apt-get install <foo> but: Without sudo, and Into a local directory The purpose of this exercise is to isolate independent builds in my continuous integration server. I don't mind compiling from source, if that's what it takes, but obviously I'd prefer the simplest approach possible. I tried apt-get source --compile <foo> as mentioned here but I can't get it working for packages like autoconf. I get the following error: dpkg-checkbuilddeps: Unmet build dependencies: help2man I've got help2man compiled in a local directory, but I don't know how to inform apt-get of that. Any ideas? UPDATE: I found an answer that almost works at http://askubuntu.com/a/350/23678. The problem with chroot is that it requires sudo. The problem with apt-get source is that I don't know how to resolve dependencies. I must say, chroot looks very appealing. Is there an equivalent command that doesn't require sudo?
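
    Not a complete solution, but a sketch of one way to get a package's files into a local directory without sudo (it assumes apt-get download is available, as on recent Ubuntu releases; help2man is used as the example because of the error above):

      PREFIX="$HOME/local"
      mkdir -p "$PREFIX"
      apt-get download help2man            # fetches the .deb into $PWD, no root needed
      dpkg -x help2man_*.deb "$PREFIX"     # unpack without touching the system dpkg database
      export PATH="$PREFIX/usr/bin:$PATH"  # make the locally unpacked tools visible

    Note that dpkg-checkbuilddeps only consults the system dpkg database, so a locally unpacked help2man will not satisfy it by itself; running dpkg-buildpackage -d on the unpacked source skips that check.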

    Read the article

  • Skip CodedUI Tests, use Selenium for Web Automation

    - by Aligned
    Originally posted on: http://geekswithblogs.net/Aligned/archive/2013/10/31/skip-codedui-tests-use-selenium-for-web-automation.aspx I recently joined a team that was using Agile methodologies to create a new product. They have a working beta product after 10 or so 2-week sprints and already had UIs that had changed several times as they went through iterations of their design. As a result, the QA team was falling behind with automated tests, and I was tasked to help them catch up and expand their tests. The project is a website. I heard many complaints about how hard it is to work with CodedUI (writing our own code, not relying on the recorder, as we wanted re-usable and more maintainable code), and then it took me 4+ hours to fix one issue. It was hard to traverse and debug the objects with breakpoints… I said out loud, “there has to be a better way, or a framework that uses jQuery to run through the tests.” Plus it seemed really slow (wait… finding the object… wait… start putting in text…). Plus some tests would randomly fail on the test agents (using the test settings and an automated build, they are run on VMs using Microsoft test agents). Enough complaining. Selenium to the rescue (mostly). The lead QA guy decided to try it out and we haven’t turned back. We are now running tests in Chrome and Firefox and they run a lot faster. We had IE running too, but some of the tests were running fine locally yet hanging on the test agents. I’ll add some hints and lessons learned in a later post.

    Read the article

  • Partner Webcast – Oracle Weblogic 12c for New Projects - 07 Nov 2013

    - by Thanos Terentes Printzios
    Fast-growing organizations need to stay agile in the face of changing customer, business or market requirements. Oracle WebLogic Server 12c is the industry's best application server platform, allowing you to quickly develop and deploy reliable, secure, scalable and manageable enterprise Java EE applications. WebLogic Server Java EE applications are based on standardized, modular components. WebLogic Server provides a complete set of services for those modules and handles many details of application behavior automatically, without requiring programming. New project applications are created by Java programmers, Web designers, and application assemblers. Programmers and designers create modules that implement the business and presentation logic for the application. Application assemblers assemble the modules into applications that are ready to deploy on WebLogic Server. Build and run high-performance enterprise applications and services with Oracle WebLogic Server 12c, available in three editions to meet the needs of traditional and cloud IT environments. Join us in this webcast as we show you how WebLogic Server 12c helps you build and deploy enterprise Java EE applications, with new features that lower the cost of operations, improve performance and enhance scalability. Agenda: Oracle WebLogic Server Introduction; Application Development on WebLogic Using Java EE; Overview of the Application Deployment Process; Monitoring Application Performance; Q&A. November 7th, 2013 - 9am UTC / 11am EET. Delivery Format: This FREE online LIVE eSeminar will be delivered over the Web. Registrations received less than 24 hours prior to start time may not receive confirmation to attend. Duration: 1 hour. REGISTER NOW. For any questions please contact us at partner.imc-AT-beehiveonline.oracle-DOT-com

    Read the article

  • JCP 2012 Award Nominations are now open!

    - by heathervc
    The 10th JCP Annual Awards Nominations are now open until 16 July 2012. Submit nominations to [email protected] or use form here. The Java Community Process (JCP) program celebrates success. Members of the community nominate worthy participants, Spec Leads, and Java Specification Requests (JSRs) in order to cheer on the hard work and creativity that produces ground-breaking results for the community and industry in the Java Standard Edition (SE), Java Enterprise Edition (EE), or Java Micro Edition (ME) platforms. The community gets together every year at the JavaOne conference to applaud in person the winners of three awards: JCP Member/Participant of the Year, Outstanding Spec Lead, and Most Significant JSR. This year’s unveiling will occur Tuesday evening, 2 October, at the Annual JCP Community Party held in San Francisco.  Nominate today...descriptions of the award categories for this year: JCP Member/Participant Of The Year - This award recognizes the corporate or individual member (either Member or Participant) who has made the most significant positive impact on the community in the past year. Leadership, investment in the community, and innovation are some of the qualities that EC Members look for in voting for this award. Outstanding Spec Lead - The role of Spec Lead is not an easy one, and the person who takes that responsibility must be, among other things, technically savvy, able to build consensus in spite of diverse corporate goals, and focused on efficiency and execution. This award recognizes the person who has brought together these qualities the best in the past year, in leading a JSR for the Java community (Java SE, Java EE or Java ME). Most Significant JSR - Specification development is key to the success of the JCP program and helps ensure we remain a fresh and vibrant community. This award recognizes the Spec Lead and Expert Group that have contributed (either in progress or final) the most significant JSR for the Java community (Java SE, Java EE or Java ME) in the past year.

    Read the article

  • How can I load one image over network to multiple computers on boot?

    - by user754730
    A few years ago I saw this in a company, but I don't know how it was built. There was one computer (I don't know if it ran Windows Server or plain Windows 7 - the server) and three other computers (Windows 7 - the clients). As soon as the Windows 7 clients were started, they all booted the same image (I don't know if it was the same image file or just the same state) over the network and could be worked on. As soon as a machine was shut down, all the changes made to the system were erased. How could I build a system like this, so I have one image file which I keep up to date and then feed to the other machines on my network? It would look like this, basically:
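
    This is essentially what network-boot (PXE) setups do: clients load their boot code over the network, attach a single shared image (for Windows 7 that is typically an iSCSI-hosted image chain-loaded via iPXE; for Linux, an NFS root with a RAM overlay), and any writes are discarded or redirected so changes vanish at shutdown. Whatever sits on top, the first building block is usually a small PXE boot server; a minimal sketch with dnsmasq follows, where the package names, file paths and address range are assumptions to adapt:

      sudo apt-get install dnsmasq syslinux
      sudo mkdir -p /srv/tftp
      sudo cp /usr/lib/syslinux/pxelinux.0 /srv/tftp/   # path varies by release

      # Hand out addresses and point booting clients at the TFTP-served loader:
      printf '%s\n' 'enable-tftp' 'tftp-root=/srv/tftp' \
        'dhcp-range=192.168.1.100,192.168.1.200,12h' 'dhcp-boot=pxelinux.0' \
        | sudo tee -a /etc/dnsmasq.conf
      sudo service dnsmasq restart

    From there the shared image itself lives on the server, so updating that one file (or iSCSI target) updates what every client boots the next time it starts.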

    Read the article

  • Tools to manage large network of heterogeneous web applications?

    - by Andrew
    I recently started a new job where I've been tasked with managing a global network of heterogeneous web applications. There's very little documentation. My first order of business is to create an inventory of all of the web applications. Are there any tools out there to manage a large group of web apps? I'd like to collect a large dataset for each website, including: logins for web-based control panels; logins to FTP/SSH accounts; the Google Analytics tracking code for each site; 3rd-party libraries used; SSL certs, issuers, and expiration dates; etc. I know I could keep the information in Excel or build a custom database, but I'm hoping there's already a tool out there to help me with this.
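
    No single tool suggestion here, but parts of that dataset can be collected automatically rather than typed in. As a sketch (sites.txt is an assumed one-hostname-per-line file), the SSL issuer and expiry date for every site can be pulled with openssl and appended to a CSV:

      echo "site,issuer,expires" > ssl-inventory.csv
      while read -r site; do
        # Fetch the server certificate (with SNI) and extract issuer and end date.
        info=$(echo | openssl s_client -connect "$site:443" -servername "$site" 2>/dev/null \
                | openssl x509 -noout -issuer -enddate)
        issuer=$(echo "$info"  | sed -n 's/^issuer=//p')
        expires=$(echo "$info" | sed -n 's/^notAfter=//p')
        echo "\"$site\",\"$issuer\",\"$expires\"" >> ssl-inventory.csv
      done < sites.txt

    The same loop structure works for other externally observable facts (response headers, detected analytics snippets), while credentials are better kept in a proper password manager than in the inventory itself.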

    Read the article

  • When can I publish a software tool written at work?

    - by AlexMA
    I'm working on a software problem at work that is fairly generic, but I can't find a library I like to solve it, so I'm considering writing one myself (at least a bare-bones version). I'll be writing some if not all of the 1.0 version at work, since I need it for the project. If it turns out well I might want to bring the work home and polish it up just for fun, and maybe release it as an open-source project. However, I'm concerned that if I wrote the 1.0 version at work I may not be allowed to do this in a legal sense. Obviously I could ask my boss (who probably won't care), but I'm curious how other programmers have dealt with this issue and where the law stands here. My one-sentence question is: when is it okay (legally/ethically) to open-source a software tool originally written by you for work, at work? What if you have expanded the original source significantly during off-hours? Follow-up: suppose I write the whole thing at home on my own time and then simply use it at work; does that change things drastically? Follow-up 2: note that I'm not trying to rip off my employer (I understand that they're paying me to build products that they own) - I'm just wondering if there's a fair way of doing this for all involved... It would be nice if some nonprofit down the road could use my code and save them some time. Also, there's another issue at stake: if I write the library for a very simple, generic thing (like HTML tables in JavaScript), does that mean I can never again do so on my own time without putting myself at legal risk (even if it were a whole new fresh rewrite or a segment of a larger project)? Am I surrendering my right to write code for this sort of project for the rest of my life (without this company's permission), since the code at work might still be somewhere in my brain influencing me? This seems related to software patents, as a side note.

    Read the article

  • Overriding RPM public key database

    - by pilcrow
    Can rpm be persuaded to import and fetch public keys from an arbitrary pubkey database? On the same build machine I've got two automated users who each need to verify package signatures from different sources, signed under different keys. If I rpm --import pkg-source1.pub pkg-source2.pub, each user will be able to verify packages intended for the other. I'd rather each user not know about the other's public keyring. Is there a way I can specify an alternate or supplementary pubkey database on a per-user or per-rpm(8)-invocation basis?
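
    One possible approach (a sketch, not tested against every rpm version): since imported public keys live in the rpm database itself, each automated user can keep a private database containing only its own key and point signature checks at it with --dbpath:

      mkdir -p "$HOME/rpmdb"
      rpm --dbpath "$HOME/rpmdb" --initdb
      rpm --dbpath "$HOME/rpmdb" --import pkg-source1.pub

      # Verification now only trusts the keys imported into this user's database:
      rpm --dbpath "$HOME/rpmdb" -K some-package.rpm

    The system-wide database stays untouched, so neither user sees the other's keyring; whether -K honours --dbpath for key lookup on your particular rpm version is worth confirming first.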

    Read the article

  • The Talent Behind Customer Experience

    - by Christina McKeon
    Earlier, I wrote about Powerful Data Lessons from the Presidential Election. A key component of the Obama team’s data analysis deserves its own discussion—the people. Recruiters are probably scrambling to find out who those Obama data crunchers are and lure them into corporations. For the Obama team, these data scientists became a secret ingredient that the competition didn’t have. This team of analysts knew how to hear the signal and ignore the noise, how to segment and target its base, and how to model scenarios and revise plans based on what the data told them. The talent was the difference. As you work to transform your organization to be more customer-centric, don’t forget that talent is a critical element. Journey mapping is a good start to understanding how your talent impacts your customer experiences. Part of journey mapping includes documenting the “on-stage” and “back-stage” systems and touchpoints. When mapping this part of your customers’ journey, include the roles and talent behind the employee actions—both customer facing and further upstream from that customer touchpoint. Know what each of these roles does, how well you are retaining people in these areas, and your plans to fill these open positions in the future. To use data scientists as an example, this job will be in high demand over the next 10 years. The workforce is shrinking, and higher education institutions may not be able to turn out trained data scientists as fast as you need them. You don’t want to be caught with a skills deficit, so consider how you can best plan for the future talent you will need. Have your existing employees make their career aspirations known to you now. You may find you already have employees willing to take on roles that drive better customer experiences. Then develop customer experience talent from within your organization through targeted learning programs. If you know that you will need to go outside the organization, build those candidate relationships now. Nurture the candidates you want to hire and partner with universities, colleges, and trade associations so you can increase the number of qualified candidates in your talent pool.

    Read the article

  • Partner Webcast - Oracle Data Integration Competency Center (DICC): A Niche Market for services

    - by Thanos Terentes Printzios
    Market success now depends on data integration speed. This is why we collected best practices from the most advanced IT leaders, simply to prove that a Data Integration competency center should be the primary new IT team you establish. This is a niche market with unlimited potential for partners to become the much-needed data integration services provider trusted by customers. We would like to elaborate with OPN Partners on the Business Value Assessment and Total Economic Impact of the Data Integration Platform for End Users, while justifying re-organizing your IT services teams. We are happy to share our research on: the economic impact of a data integration platform/competency center; justifying the strongest reasons and differentiators, using numeric analysis and best-practice customer case studies from specific industries; utilizing diagnostics and health-check analysis in building a business case for your customers; what exactly is so special about the technology of Oracle Data Integration; the impact of growing data volume and number of data sources; analysis of the usual solutions implemented so far, addressing key challenges and mistakes. During this partner webcast we will balance business-case-centric content with extensive numerical ROI analysis. Join us to find out how to build a unified approach to moving/sharing/integrating data across the enterprise and why this is an important new services opportunity for partners. Agenda: Data Integration Competency Center; Oracle Data Integration Solution Overview; Services Niche Market for OPN; Summary; Q&A. Delivery Format: This FREE online LIVE eSeminar will be delivered over the Web. Registrations received less than 24 hours prior to start time may not receive confirmation to attend. Presenter: Milomir Vojvodic, EMEA Senior Business Development Manager for Oracle Data Integration Product Group. Date: Thursday, September 4th, 10am CEST (8am UTC / 11am EEST). Duration: 1 hour. Register Today. For any questions please contact us at [email protected]

    Read the article

  • if exist !SOMEPATH! not working in batch file

    - by akash
    I have a batch script in which I am using multiple if exist statements. The problem is that all of the statements work except one. The following variables are set:

      SETLOCAL ENABLEDELAYEDEXPANSION
      SET basedrive=E:
      SET tfworkspace=!basedrive!\TFS
      SET envdefault=%1
      SET projenv=!envdefault!
      echo subapp=!subapp! subappservice=!subappservice!
      SET tfworkspacepath=!tfworkspace!\!releasebranch!\!app!\!subapp!
      SET tfworkspacepathservice=!tfworkspace!\!releasebranch!\!app!\!subapp!\sourcecode\build\!projenv!

    This statement works:

      if exist "!tfworkspacepath!" (robocopy "!tfworkspacepath!"\sourcecode\messagebroker\ /E /NFL /NJS /NDL /ETA "!basedir!\!messagebroker!" ) else SET /a foldererror=1

    This statement doesn't work; by "doesn't work" I mean that even though the path does not exist, it still tries to robocopy:

      if exist !tfworkspacepathservice! ( robocopy !tfworkspacepathservice! /E /NFL /NJS /NDL /ETA "!basedir!\!scripts!") else SET /a foldererror =!foldererror!+1

    I am new to batch writing, please guide me.

    Read the article

  • Problems loading Hilva tutorials

    - by Beska
    I'm a newcomer to XNA, and I'm evaluating some libraries. The Hilva Graphics Engine looks interesting, and I'm trying to run their tutorials. However, all of them give me errors. For example, if I download the ParallaxMappingSample demo, and try to build it, I get Error 1 Error loading pipeline assembly "C:\Users\Me\Desktop\ParallaxMappingSample\Hilva.Content.dll". ParallaxMappingSample I get similar errors for all of the samples. Unfortunately, this error isn't very enlightening. I can see the Hilva.Content.dll in the appropriate directory. I tried removing and readding the reference from the content project, but I get the same error. I'm not sure it's relevant, but I'm on Windows 7, I'm using Microsoft Visual Studio 2010, and XNA 4.0. Is there an easy (or difficult) solution? EDIT: If you happen to try this, even if you don't have a solution, let me know about it in a comment. Whether it works for you, or if you get the same problem...either result would be something that might let me know if it's just a problem with the tutorial, or if it's on my end.

    Read the article

  • Why can't I compile this version of Postfix?

    - by Coofucoo
    I just installed Postfix 2.7.11 on Ubuntu Server from source code. I am not using Ubuntu's own package because I need this older version. I ran into a very interesting problem. Before, on both CentOS 5 and 6, I could build the source code without any problem. But on Ubuntu Server 12.04 it is totally different. I get the following errors:

      dict_nis.c:173: error: undefined reference to 'yp_match'
      dict_nis.c:187: error: undefined reference to 'yp_match'
      dns_lookup.c:347: error: undefined reference to '__dn_expand'
      dns_lookup.c:218: error: undefined reference to '__res_search'
      dns_lookup.c:287: error: undefined reference to '__dn_expand'
      dns_lookup.c:498: error: undefined reference to '__dn_expand'
      dns_lookup.c:383: error: undefined reference to '__dn_expand'

    The reason is obvious: I just searched for the related libraries, added them to the makefile, and it works. The question is why? What is the difference between Ubuntu Server and CentOS? One possibility is the gcc and ld versions: Ubuntu Server uses different versions of gcc and ld than CentOS. But I am not sure.
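
    For what it's worth, a sketch of the usual way to pass those libraries through Postfix's own build interface instead of editing the generated makefiles by hand (yp_match lives in libnsl, __dn_expand and __res_search in libresolv):

      make makefiles AUXLIBS="-lnsl -lresolv"
      make

    As for the why: one common explanation is that Postfix's makedefs script only adds -lnsl and -lresolv automatically when it finds them in the library directories it knows about, and Ubuntu 12.04's multiarch layout (/usr/lib/x86_64-linux-gnu) is not among them, whereas CentOS keeps the libraries in /usr/lib or /usr/lib64; newer Ubuntu toolchains are also stricter about indirectly needed libraries. Either way, naming the libraries explicitly resolves it.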

    Read the article

  • Learning curve webdevelopment

    - by refro
    At the moment our team has a huge challenge: we're being asked to deliver a new GUI for an embedded controller. The deadline is very tight and is set for April 2013. Our team is very diverse: some people are at the level of functional programming (mostly C), others (including myself) also master object-oriented programming (C++, C#). We built a prototype with Android; although it has its quirks, it is mostly just OO. For the future there is a wish to support multiple platforms (Windows, Android, iOS). In my opinion an HTML5 app with a native app shell is the way to go. When gathering more information on the frameworks to use, etc., it became obvious to me that a paradigm shift is needed. None of us has a web background, so we would need to learn from the ground up. The shift from functional to OO took us about 6 months to become productive (and some of the early subsystems were rewritten because they were a total mess). Can we expect the learning curve to be similar? Can this be pulled off with a web app? (My feeling says it will already be hard to pull off as a native app, which is at the edge of our comfort zone.)

    Read the article

  • Tracking "To Do" Items

    - by Bill Graziano
    One of the challenges I struggle with is keeping a good "to do" list of things I need to do on the various SQL Servers I support. I have servers that I don't visit on a regular basis so my situation may be different than many of you. Though I'm sure you all have servers that you only touch every few months. (And it's usually the accounting server!) It's difficult for me to remember what changes I made and what changes I need to make. I've tried Outlook, OneNote and various other to do list managers and haven't been happy with any of them. Many are close but just don't give me what I need. As a result I've started writing my own. It's web-based so you can use it from anywhere -- including on a server. It also knows just enough about SQL Server to help structure your to do items and your notes. It isn't agent based and doesn't do any monitoring. Think OneNote or Evernote but with some "SQL Servery" stuff built in. If you'd like to try this or take a survey I'm putting together, add your email address to my mailing list.  I should be ready in a week or so.  I'm only going to use this list for notifications about this service. I'd like to find a small group of people that feel the same pain I do and maybe we can build something interesting.

    Read the article

  • What would be the best way to correlate logs and events on several hosts?

    - by user220746
    I'm trying to build a log correlation system across multiple hosts. SEC seems interesting, but I don't know if it will cover my needs. How could I correlate system events, logs, network events, etc. on multiple hosts at the same time, in real time? Examples: If 5 failed logins happened on host A in the last minute and firewall B has denied lots of access attempts on different ports of A, then we assume there is a potential attack in progress on A. If the Apache service on host A didn't receive any requests for the last N minutes while the Apache service on host B did, then the load balancing could be faulty.
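
    SEC (or OSSEC, or a full SIEM) is the usual tool for exactly this, but as an illustration of the first example rule, here is a minimal sketch; it assumes both hosts already forward their logs to the correlation box via rsyslog into /var/log/remote/<host>/ and that 10.0.0.5 is host A's address:

      failed=$(grep -c 'Failed password' /var/log/remote/hosta/auth.log)
      denied=$(grep -c 'DST=10.0.0.5'    /var/log/remote/firewallb/firewall.log)

      # A real rule would also restrict both counts to a sliding one-minute
      # window, which is what SEC rule types such as SingleWithThreshold provide.
      if [ "$failed" -ge 5 ] && [ "$denied" -ge 20 ]; then
        logger -p auth.warning "possible attack in progress against hosta"
      fi

    Centralizing the logs first is the key step: once everything lands on one host in real time, correlation is just a matter of expressing rules like the two examples above, whether in SEC's rule language or in scripts like this.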

    Read the article

  • Ubuntu+Mono+Postgres+ASP.NET 4.0. No problem?

    - by wreck_of_u
    Would this be OK? I'm an ASP.NET developer and I'm planning to build "portable" web app servers based on Atom D510 mini-ITX boards. I have run Ubuntu 10 with MySQL alongside separate IIS machines (Win 2k3, 2k8) before with no problems. But now I'm thinking of "packaging" the web and database server into one small, cheap machine. Ubuntu/Mono/Postgres/ASP.NET seemed like a good idea to me, but I'm not sure, as I have not actually tried it yet. Your thoughts?

    Read the article

  • What is the cheapest non-colocation way to serve about 10 static files at a rate of 100 megabits per

    - by Mark Maunder
    I've looked at Amazon S3 and it costs roughly $4746 per month for 100 megabits/s (which translates into 31,640 Gigabytes of data transferred. That's at a rate of $0.15 per gig.) I haven't found a cheaper "cloud" option. I'm curious if there's any other cloud hosting option out there cheaper than S3. Uptime is not an issue because I can build failover for most things into the browser. e.g. I can use javascript to say "if the image didn't load then go to this other URL instead." FYI I'm currently using a colocation facility which is about 30% cheaper than S3 and I'm familiar with colo prices - so this question is really about "cloud" services and by that I mean services where I don't have to worry about the infrastructure.

    Read the article

  • Part 4: Development Standards or How to share

    - by volker.eckardt(at)oracle.com
    Although we usually introduce the custom development part in EBS projects as “a small piece only” that “we will avoid as much as possible”, the development effort can be enormous and should therefore be well addressed by project standards. Any additional solution, software tool or product will influence the custom development rules (by adding, removing or replacing sections). It is very common in EBS projects to create a so-called “MD.030 Development Standards” document and put everything related to development conventions into it. This document gets approval and is shared among all developers. Later, additional sections have to be added, and usually the development lead is responsible for doing this. However, sometimes the development techniques used are not documented properly, and therefore the development solutions deviate from each other, or from the initially agreed standards. My advice would be the following: keep the MD.030 as a base document, and add a wiki on top. The “Development Wiki” covers the following: collect input from every developer without updating the MD.030 directly; collect additional topics that might need further specification; allow a discussion about such topics by reviewing/updating the wiki directly; add decisions or open questions right into it. In one of my own projects we used this “Developer Wiki” quite extensively, and my experience is very positive. We had different sections in it, good cross references, but also additional material like code templates, links to external web pages, etc. By using this wiki, the development standards became “owned” by the right group of people, the developers. They recognized that information sharing can improve the overall development quality, but also reduce the workload on individuals. Finally, the wiki was much more accurate and helpful for the daily development work than our initial MD.030, and we all decided to retire the document completely. Summary: Information sharing in the development area is very important! The usual “MD.030 Development Standards” is a good starting point, but should be combined with a “Development Wiki”, allowing everyone to address and discuss necessary improvements. A well-structured wiki can replace the document in some sections completely. Side Note: The corresponding task in Oracle OUM (Oracle Unified Method) is DS.050 ‘Determine Design and Build Standards’. Volker

    Read the article

  • Why not commit unresolved changes?

    - by Explosion Pills
    In a traditional VCS, I can understand why you would not commit unresolved files because you could break the build. However, I don't understand why you shouldn't commit unresolved files in a DVCS (some of them will actually prevent you from committing the files). Instead, I think that your repository should be locked from pushing and pulling, but not committing. Being able to commit during the merging process has several advantages (as I see it): The actual merge changes are in history. If the merge was very large, you could make periodic commits. If you made a mistake, it would be much easier to roll back (without having to redo the entire merge). The files could remain flagged as unresolved until they were marked as resolved. This would prevent pushing/pulling. You could also potentially have a set of changesets act as the merge instead of just a single one. This would allow you to still use tools such as git rerere. So why is committing with unresolved files frowned upon/prevented? Is there any reason other than tradition?

    Read the article
