Search Results

Search found 762 results on 31 pages for 'haskell'.

Page 25/31 | < Previous Page | 21 22 23 24 25 26 27 28 29 30 31  | Next Page >

  • How much sense does it make for a veteran .Net developer to move to ROR professionally?

    - by SharePoint Newbie
    Hi, I consider myself a moderately skilled (definitely not stupid) .Net developer. Over the past 5 years I've been working with ASP.Net, ASP.Net MVC, SharePoint, WPF, Silverlight, and RDBMSs (SQL Server and Oracle). I maintain/contribute to a couple of .Net OSS projects. I've also picked up F# and Haskell over the previous year. I am currently employed at one of the better (best) software firms out there and would surely love to continue working here. However, over the past 6 months opportunities in .Net have mostly dried up and all new work is headed towards ROR (and whatever is left, towards Java). I have never been apprehensive about learning a new stack/language for fun and have previously picked up Haskell and Python in my free time. I am, however, apprehensive about what impact moving to an entirely different stack would have on my career. What would you do: change jobs if you don't find anything on .Net soon? Try out the ROR stack for some time and, if you find that it's not your cup of tea, move back? (How would this impact my career and job opportunities in the longer run?) Also, it would be very helpful if there are any ASP.Net MVC folks who have switched over to ROR professionally who can share their experiences. Edit: I have not done any development on a *nix box before. I've however used Ubuntu for fun and games. Sorry if this sounds subjective.

    Read the article

  • List of freely available programming books

    - by Karan Bhangui
    I'm trying to amass a list of programming books with opensource licenses, like Creative Commons, GPL, etc. The books can be about a particular programming language or about computers in general. Hoping you guys could help: Languages BASH Advanced Bash-Scripting Guide (An in-depth exploration of the art of shell scripting) C The C book C++ Thinking in C++ C++ Annotations How to Think Like a Computer Scientist C# .NET Book Zero: What the C or C++ Programmer Needs to Know About C# and the .NET Framework Illustrated C# 2008 (Dead Link) Data Structures and Algorithms with Object-Oriented Design Patterns in C# Threading in C# Common Lisp Practical Common Lisp On Lisp Java Thinking in Java How to Think Like a Computer Scientist Java Thin-Client Programming JavaScript Eloquent JavaScript Haskell Real world Haskell Learn You a Haskell for Great Good! Objective-C The Objective-C Programming Language Perl Extreme Perl (license not specified - home page is saying "freely available") The Mason Book (Open Publication License) Practical mod_perl (CreativeCommons Attribution Share-Alike License) Higher-Order Perl Learning Perl the Hard Way PHP Practical PHP Programming Zend Framework: Survive the Deep End PowerShell Mastering PowerShell Prolog Building Expert Systems in Prolog Adventure in Prolog Prolog Programming A First Course Logic, Programming and Prolog (2ed) Introduction to Prolog for Mathematicians Learn Prolog Now! Natural Language Processing Techniques in Prolog Python Dive Into Python Dive Into Python 3 How to Think Like a Computer Scientist A Byte of Python Python for Fun Invent Your Own Computer Games With Python Ruby Why's (Poignant) Guide to Ruby Programming Ruby - The Pragmatic Programmer's Guide Mr. Neighborly's Humble Little Ruby Book SQL Practical PostgreSQL x86 assembly Paul Carter's tutorial Lua Programming In Lua (for v5 but still largely relevant) Algorithms and Data Structures Algorithms Data Structures and Algorithms with Object-Oriented Design Patterns in Java Planning Algorithms Frameworks/Projects The Django Book The Pylons Book Introduction to Design Patterns in C++ with Qt 4 (Open Publication License) Version control The SVN Book Mercurial: The Definitive Guide Pro Git UNIX / Linux The Art of Unix Programming Linux Device Drivers, Third Edition Others Structure and Interpretation of Computer Programs The Little Book of Semaphores Mathematical Logic - an Introduction An Introduction to the Theory of Computation Developers Developers Developers Developers Linkers and loaders Beej's Guide to Network Programming Maven: The Definitive Guide I will expand on this list as I get comments or when I think of more :D Related: Programming texts and reference material for my Kindle What are some good free programming books? Can anyone recommend a free software engineering book? Edit: Oh I didn't notice the community wiki feature. Feel free to edit your suggestions right in!

    Read the article

  • How do you use kate? Tips/Tricks/Workflow

    - by Roman A. Taycher
    We've all seen a bunch of these, mostly for IDEs but also for vim and emacs. Kate is (only) a text editor (an awesome one), but it has a ton of options plus a number of plugins, so it's hard to know all of it well. How do you use the Kate text editor? Please share your workflow and help me and others learn some of the cool tricks you use. I'll start: I use the built-in terminal a lot for opening files quickly, and as an enhanced Haskell REPL with ghci (since ghci alone doesn't let you enter all kinds of Haskell code). I also use split views to quickly compare files (especially different versions of the same file). The auto-complete may be simple (more useful for saving typing time than for remembering functions), but it works really well for that. Also, if you highlight something and hit an opening [, {, or (, it wraps the selection in brackets rather than replacing it with a bracket (why do so many IDEs not have this feature?).

    Read the article

  • The sieve of Eratosthenes in F#

    - by IVlad
    I am interested in an implementation of the sieve of Eratosthenes in purely functional F#. I am interested in an implementation of the actual sieve, not the naive functional implementation that isn't really the sieve, so not something like this: let rec PseudoSieve list = match list with | hd::tl -> hd :: (PseudoSieve <| List.filter (fun x -> x % hd <> 0) tl) | [] -> [] The second link above briefly describes an algorithm that would require the use of a multimap, which isn't available in F# as far as I know. The Haskell implementation given uses a map that supports an insertWith method, which I haven't seen available in the F# functional map. Does anyone know a way to translate the given Haskell map code to F#, or perhaps know of alternative implementation methods or sieving algorithms that are as efficient and better suited for a functional implementation or F#?
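
    For context, a minimal Haskell sketch of the map-based incremental sieve the question alludes to (in the style of Melissa O'Neill's "The Genuine Sieve of Eratosthenes"): the map sends each upcoming composite to the primes that cross it off, with Data.Map's insertWith standing in for the multimap insert. Treat it as an illustration of the idea rather than a drop-in F# answer; the names primes, sieve and step are just for this sketch.

        import qualified Data.Map as Map

        primes :: [Integer]
        primes = sieve [2..] Map.empty
          where
            sieve []     _     = []
            sieve (x:xs) table = case Map.lookup x table of
                Nothing -> x : sieve xs (Map.insert (x * x) [x] table)    -- x is prime: schedule its square
                Just ps -> sieve xs (foldl step (Map.delete x table) ps)  -- x is composite: reschedule its prime factors
              where
                step t p = Map.insertWith (++) (x + p) [p] t              -- advance prime p to its next multiple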

    Read the article

  • Windows 7 PATH not expanding

    - by trinithis
    I am using the following to create and edit environment variables for Windows 7: Control Panel\All Control Panel Items\System -> Advanced system settings -> Environment Variables. Under System variables I have the following pertinent variables: PROG32=C:\Program Files (x86) REALDWG_SDK_DIR=%PROG32%\Autodesk\RealDWG 2011 Path=%REALDWG_SDK_DIR%;%PROG32%\Haskell\bin However, the following happens: C:\>echo %PROG32% C:\Program Files (x86) C:\>echo %Path% %REALDWG_SDK_DIR%;C:\Program Files (x86)\Haskell\bin Is it possible to have a chain of variables expand? If I rename Path to something else, I sometimes get the problem and sometimes I don't.

    Read the article

  • Why are there no package management systems for C and C++?

    - by m0nhawk
    There are some programming languages that have their own package management systems: CTAN for TeX CPAN for Perl Pip & Eggs for Python Maven for Java cabal for Haskell Gems for Ruby Are there any other languages with such systems? What about C and C++? (That's the main question!) Why are there no such systems for them? And isn't creating packages for yum, apt-get or other general package management systems better?

    Read the article

  • category theory based language

    - by pagoda_5b
    It may sound naive, but is there any programming language, or research thereof, based entirely on category theory? I mean this as opposed to embedding CT concepts as an additional feature (as in Haskell or Scala). Would it be too abstract or too complex an approach, or are there any known reasons that make it impossible or impractical? I have only a relative understanding of the theory as related to programming, so please give me some explanation if the question doesn't make sense at all.

    Read the article

  • Why aren't web frameworks simple, elegant and fun like programming languages? [on hold]

    - by Ryan
    When I think of pretty much any programming language - like C, C++, PHP, SQL, JavaScript, Python, ActionScript, Haskell, Lua, Lisp, Java, etc. - I'm like, awesome, I would love to develop a computer application using any of those languages. But when I think of web frameworks (I do mostly PHP) - like Cake, CI, Symfony, Laravel, Zend, Drupal, Joomla, WordPress, Rails, Django, etc. - I'm like, god no. Why aren't there web frameworks that provide me with simple, fun and powerful constructs the way a programming language does?

    Read the article

  • Is diversifying my programming knowledge good?

    - by the_great_monkey
    I have skills in so many programming languages, such as Java, C++, C, Obj-C, Scala, Haskell, and Matlab. However, I don't know/like web programming at all. I also get bored very quickly, so I haven't worked on any Java project bigger than, say, 20-30 Java files. I'm finishing off my degree and I want to work as a developer, particularly in the mobile area. Do I have enough skills to be recruited by good companies?

    Read the article

  • generic programming- where did it originate?

    - by user997112
    I'm trying to work out whether generic programming was a functional programming feature that was then introduced into Java, C++ and C#, or whether the latter copied it from functional programming languages like Haskell, Lisp, OCaml etc. Google is giving me lots on what generic programming is, but not on where it originated. All I can see is that Ada implemented it early on. Would you class it as a functional programming technique?

    Read the article

  • Why don't languages include implication as a logical operator?

    - by Maciej Piechotka
    It might be a strange question, but why is there no implication as a logical operator in many languages (Java, C, C++, Python, Haskell - although since the last one has user-defined operators it's trivial to add)? I find logical implication much clearer to write (particularly in asserts or assert-like expressions) than negation with or: encrypt(buf, key, mode, iv = null) { assert (mode != ECB --> iv != null); assert (mode == ECB || iv != null); assert (implies(mode != ECB, iv != null)); // User-defined function }
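
    As a rough illustration of the Haskell aside above, a user-defined implication operator really is a one-liner; the operator name ==> and the ECB/iv names in the comment are just for illustration.

        infixr 1 ==>
        (==>) :: Bool -> Bool -> Bool
        p ==> q = not p || q

        -- e.g. the assertion from the question, Haskell-style (hypothetical names):
        --   assert (mode /= ECB ==> isJust iv)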

    Read the article

  • Ramda: a library for functional programming with JavaScript, available in open source, including automatic currying

    Ramda: a library for functional programming with JavaScript. Available as open source, it includes automatic currying. Could functional languages be gaining momentum? That may well be the case, and some developers are rethinking their applications, like IMVU, which rewrote part of its application's back-end in Haskell, while others are leaning towards developing libraries dedicated to functional programming, building on...

    Read the article

  • The long road to bug-free software

    - by Tony Davis
    The past decade has seen a burgeoning interest in functional programming languages such as Haskell or, in the Microsoft world, F#. Though still on the periphery of mainstream programming, functional programming concepts are gradually seeping into the imperative C# language (for example, Lambda expressions have their root in functional programming). One of the more interesting concepts from functional programming languages is the use of formal methods, the lofty ideal behind which is bug-free software. The idea is that we write a specification that describes exactly how our function (say) should behave. We then prove that our function conforms to it, and in doing so have proved beyond any doubt that it is free from bugs. All programmers already use one form of specification, specifically their programming language's type system. If a value has a specific type then, in a type-safe language, the compiler guarantees that value cannot be an instance of a different type. Many extensions to existing type systems, such as generics in Java and .NET, extend the range of programs that can be type-checked. Unfortunately, type systems can only prevent some bugs. To take a classic problem of retrieving an index value from an array, since the type system doesn't specify the length of the array, the compiler has no way of knowing that a request for the "value of index 4" from an array of only two elements is "unsafe". We restore safety via exception handling, but the ideal type system will prevent us from doing anything that is unsafe in the first place and this is where we start to borrow ideas from a language such as Haskell, with its concept of "dependent types". If the type of an array includes its length, we can ensure that any index accesses into the array are valid. The problem is that we now need to carry around the length of arrays and the values of indices throughout our code so that it can be type-checked. In general, writing the specification to prove a positive property, even for a problem very amenable to specification, such as a simple sorting algorithm, turns out to be very hard and the specification will be different for every program. Extend this to writing a specification for, say, Microsoft Word and we can see that the specification would end up being no simpler, and therefore no less buggy, than the implementation. Fortunately, it is easier to write a specification that proves that a program doesn't have certain, specific and undesirable properties, such as infinite loops or accesses to the wrong bit of memory. If we can write the specifications to prove that a program is immune to such problems, we could reuse them in many places. The problem is the lack of specification "provers" that can do this without a lot of manual intervention (i.e. hints from the programmer). All this might feel a very long way off, but computing power and our understanding of the theory of "provers" advances quickly, and Microsoft is doing some of it already. Via their Terminator research project they have started to prove that their device drivers will always terminate, and in so doing have suddenly eliminated a vast range of possible bugs. This is a huge step forward from saying, "we've tested it lots and it seems fine". What do you think? What might be good targets for specification and verification? SQL could be one: the cost of a bug in SQL Server is quite high given how many important systems rely on it, so there's a good incentive to eliminate bugs, even at high initial cost. 
[Many thanks to Mike Williamson for guidance and useful conversations during the writing of this piece] Cheers, Tony.
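
    As a loose illustration of the "array type that includes its length" idea mentioned in the piece, here is a hedged Haskell sketch that approximates dependent types with GHC extensions; the Nat/Vec names are conventional examples, not anything from the article.

        {-# LANGUAGE DataKinds, GADTs, KindSignatures #-}

        data Nat = Z | S Nat

        -- A list whose length is part of its type.
        data Vec (n :: Nat) a where
          VNil  :: Vec 'Z a
          VCons :: a -> Vec n a -> Vec ('S n) a

        -- Total by construction: it cannot even be called on an empty vector,
        -- so the "index out of range" class of bug is ruled out at compile time.
        vhead :: Vec ('S n) a -> a
        vhead (VCons x _) = x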

    Read the article

  • What is faster- Java or C# (Or good old C)?

    - by Rexsung
    I'm currently deciding on a platform to build a scientific computational product on, and am deciding between C#, Java, and plain C with Intel's compiler, on Core 2 Quad CPUs. It's mostly integer arithmetic. My benchmarks so far show Java and C are about on par with each other, and .NET/C# trails by about 5% - however, a number of my coworkers are claiming that .NET with the right optimizations will beat both of these given enough time for the JIT to do its work. I always assumed that the JIT would have done its job within a few minutes of the app starting (probably a few seconds in my case, as it's mostly tight loops), so I'm not sure whether to believe them. Can anyone shed any light on the situation? Would .NET beat Java? (Or am I best just sticking with C at this point?) The code is highly multithreaded and the data sets are several terabytes in size. Haskell/Erlang etc. are not options in this case as there is a significant quantity of existing legacy C code that will be ported to the new system, and porting C to Java/C# is a lot simpler than to Haskell or Erlang (unless of course these provide a significant speedup). Edit: We are considering moving to C# or Java because they may, in theory, be faster. Every percent we can shave off our processing time saves us tens of thousands of dollars per year. At this point we are just trying to evaluate whether C, Java, or C# would be faster.

    Read the article

  • How do you read a file line by line in your language of choice?

    - by Jon Ericson
    I got inspired to try out Haskell again based on a recent answer. My big block is that reading a file line by line (a task made simple in languages such as Perl) seems complicated in a functional language. How do you read a file line by line in your favorite language? So that we are comparing apples to other types of apples, please write a program that numbers the lines of the input file. So if your input is: Line the first. Next line. End of communication. The output would look like: 1 Line the first. 2 Next line. 3 End of communication. I will post my Haskell program as an example. Ken commented that this question does not specify how errors should be handled. I'm not overly concerned about it because: Most answers did the obvious thing and read from stdin and wrote to stdout. The nice thing is that it puts the onus on the user to redirect those streams the way they want. So if stdin is redirected from a non-existent file, the shell will take care of reporting the error, for instance. The question is more aimed at how a language does IO than how it handles exceptions. But if necessary error handling is missing in an answer, feel free to either edit the code to fix it or make a note in the comments.
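
    A minimal sketch of the kind of Haskell answer the question invites - read standard input lazily, number the lines, and print them (the helper names are just for illustration):

        main :: IO ()
        main = interact numberLines
          where
            numberLines = unlines . zipWith prefix [1 :: Int ..] . lines
            prefix n line = show n ++ " " ++ line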

    Read the article

  • Calculate the digital root of a number

    - by Gregory Higley
    A digital root, according to Wikipedia, is "the number obtained by adding all the digits, then adding the digits of that number, and then continuing until a single-digit number is reached." For instance, the digital root of 99 is 9, because 9 + 9 = 18 and 1 + 8 = 9. My Haskell solution -- and I'm no expert -- is as follows. digitalRoot n | n < 10 = n | otherwise = digitalRoot . sum . map (\c -> read [c]) . show $ n As a language junky, I'm interested in seeing solutions in as many languages as possible, both to learn about those languages and possibly to learn new ways of using languages I already know. (And I know at least a bit of quite a few.) I'm particularly interested in the tightest, most elegant solutions in Haskell and REBOL, my two principal "hobby" languages, but any ol' language will do. (I pay the bills with unrelated projects in Objective C and C#.) Here's my (verbose) REBOL solution: digital-root: func [n [integer!] /local added expanded] [ either n < 10 [ n ][ expanded: copy [] foreach c to-string n [ append expanded to-integer to-string c ] added: 0 foreach e expanded [ added: added + e ] digital-root added ] ] EDIT: As some have pointed out either directly or indirectly, there's a quick one-line expression that can calculate this. You can find it in several of the answers below and in the linked Wikipedia page. (I've awarded Varun the answer, as the first to point it out.) Wish I'd known about that before, but we can still bend our brains with this question by avoiding solutions that involve that expression, if you're so inclined. If not, Crackoverflow has no shortage of questions to answer. :)
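
    For reference, a hedged sketch of the one-line closed form alluded to in the edit (the standard congruence: the digital root of 0 is 0, otherwise 1 + (n - 1) mod 9), written in Haskell:

        digitalRoot :: Integer -> Integer
        digitalRoot 0 = 0
        digitalRoot n = 1 + (n - 1) `mod` 9   -- valid for non-negative n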

    Read the article

  • Best (functional?) programming language to learn coming from Mathematica

    - by Will Robertson
    As a mechanical engineering PhD student, I haven't had a great pedigree in programming as part of my “day job”. I started out in Matlab (having written some Hypercard and Applescript back in the day, and being introduced to Ada, of all things, in my 1st undergrad year), learned to program—if you can call it that—in (La)TeX; and finally discovered and fell for Mathematica. Now I'm interested in learning a "real" programming language that I can enjoy in the same sort of style as Mathematica, which tries to stress functional programming since it seems to map more nicely to how certain kinds of mathematics can be written algorithmically. So which functional language should I learn? I guess the obvious answer is “as many as possible”, but let's start out humble and give a single, well-considered option a good crack. I've heard good things about, say, Haskell and Scala, but I wonder if (given my non–computer science background) I'd be better off starting in more “grounded” territory and going with Ruby or Python (the latter having the big advantage of being used for Sage, which I'd also like to investigate…after my PhD). Well, I guess this is pretty subjective, so perhaps I could rephrase: would it be better to start looking at Haskell (say) straight after an ad-hoc education to functional programming in Mathematica, or will I get more out of learning Python (say) first? In reference to the question "what do I want to do with it?", I guess my answer is "fun, and learning more". I've got this list of languages that I'd like to look at, and I don't know how to trim them down. And I'd rather start with something a little higher-level than C simply so that I can be somewhat productive without having to re-invent many wheels for any code I'd like to write.

    Read the article

  • How can I configure GIMP 2.8 to be a single window in XMonad?

    - by Pubby
    I'm trying to get GIMP to display as a single window in XMonad. Currently, it's floating strangely in front of every other display and I can't use it. I have tried reading this: http://www.haskell.org/haskellwiki/Xmonad/General_xmonad.hs_config_tips#Gimp But it seems this is for versions of GIMP before 2.8 when there wasn't the option to have GIMP use only 1 window. Because of this, it's an XMonad problem, not a GIMP one. How can I do this?
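
    A hedged sketch of the xmonad side of a fix, assuming a stock xmonad.hs and that GIMP reports the window class "Gimp" (both assumptions): the usual culprit is a manageHook rule that floats every Gimp window, so leaving that rule out lets the 2.8 single window tile like any other client.

        import XMonad

        main :: IO ()
        main = xmonad defaultConfig { manageHook = myManageHook }
          where
            -- The template/default manageHook historically contains a rule like
            --   className =? "Gimp" --> doFloat
            -- which is what makes GIMP float; this hook simply omits it.
            myManageHook = composeAll
                [ className =? "MPlayer" --> doFloat  -- keep whatever other rules you want
                ]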

    Read the article

  • What Functional features are worth a little OOP confusion for the benefits they bring?

    - by bonomo
    After learning functional programming in Haskell and F#, the OOP paradigm seems ass-backwards with classes, interfaces, objects. Which aspects of FP can I bring to work that my co-workers can understand? Are any FP styles worth talking to my boss about retraining my team so that we can use them? Possible aspects of FP: Immutability Partial Application and Currying First Class Functions (function pointers / Functional Objects / Strategy Pattern) Lazy Evaluation (and Monads) Pure Functions (no side effects) Expressions (vs. Statements - each line of code produces a value instead of, or in addition to causing side effects) Recursion Pattern Matching Is it a free-for-all where we can do whatever the programming language supports to the limit that language supports it? Or is there a better guideline?
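
    As a concrete anchor for one item on that list, a tiny Haskell sketch of "First Class Functions (Strategy Pattern)": the varying behaviour is passed in as an ordinary function argument (applyPricing and halfPrice are made-up names for illustration).

        applyPricing :: (Double -> Double) -> [Double] -> [Double]
        applyPricing strategy prices = map strategy prices

        halfPrice :: Double -> Double
        halfPrice = (* 0.5)

        -- applyPricing halfPrice [10, 20]  ==  [5.0, 10.0]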

    Read the article

  • To maximize chances of functional programming employment

    - by Rob Agar
    Given that the future of programming is functional, at some point in the nearish future I want to be paid to code in a functional language, preferably Haskell. Assuming I have a firm grasp of the language, plus all the basic programmer attributes (good communication skills/sense of humour/hygiene etc), what should I concentrate on learning to maximize my chances? Are there any particularly sought after libraries I should know? Alternatively, would another language be a better bet, say F#? (I'm not too fussed about the kind of programming work, so long as it's reasonably interesting and reasonably well paid, and with nice people)

    Read the article

  • Selling your services when you use uncommon technologies

    - by speeder
    I took a look at Stack Overflow's most popular profiles, and then I did the same on several other sites, and then I looked at job postings on several boards, mostly out of curiosity, because I noticed this: if you work with Java, .NET or other managed languages, or you work with stuff that is popular for web development (Ruby, JavaScript, etc...), you can get lots of points on Stack Overflow, find lots of jobs and clients, find forums, friends, colleagues, etc... But how does a programmer of uncommon languages (Lua, pure C, Lisp, D, Ada, Haskell, etc...) find information, sell his services, and so on? EDIT: This also applies to fields: if you work with web, corporate software, databases, etc... it is great... if you dislike those previous 3, no one will ever hire your services...

    Read the article

  • Outdoor Programming Jobs...

    - by Rodrick Chapman
    Are there any kinds of jobs that require programming (or at least competency) but take place outdoors for a significant portion of the time? As long as I'm fantasizing, an ideal job would involve programming in a high level language like Haskell, F#, or Scala* for, say, 50% of the time and doing something like digging an irrigation trench the rest of the time. My background: I triple majored in mathematics, philosophy, and history (BS/BA) and have been working as a web developer for the past six years. I love hacking but I'm feeling a bit burned out. *I only chose these languages as examples since, ideally, I'd want to work among high caliber people... but it really doesn't matter.

    Read the article

  • What's special about currying or partial application?

    - by Vigneshwaran
    I've been reading articles on functional programming every day and been trying to apply some practices as much as possible. But I don't understand what is unique about currying or partial application. Take this Groovy code as an example: def mul = { a, b -> a * b } def tripler1 = mul.curry(3) def tripler2 = { mul(3, it) } I do not understand the difference between tripler1 and tripler2. Aren't they both the same? Currying is supported in pure or partially functional languages like Groovy, Scala, Haskell etc. But I can do the same thing (left-curry, right-curry, n-curry or partial application) by simply creating another named or anonymous function or closure that will forward the parameters to the original function (like tripler2) in most languages (even C). Am I missing something here? There are places where I can use currying and partial application in my Grails application but I am hesitating to do so because I'm asking myself "How's that different?" Please enlighten me.
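
    For comparison, a small Haskell sketch of the same pair of definitions: every Haskell function is curried by default, so partial application needs no wrapper closure at all (the names mirror the Groovy example and are illustrative only).

        mul :: Int -> Int -> Int
        mul a b = a * b

        tripler1 :: Int -> Int
        tripler1 = mul 3        -- partial application: no explicit lambda or closure

        tripler2 :: Int -> Int
        tripler2 x = mul 3 x    -- the hand-written wrapper; same behaviour, more noise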

    Read the article

  • Is functional programming a superset of object oriented?

    - by Jimmy Hoffa
    The more functional programming I do, the more I feel like it adds an extra layer of abstraction that, like an onion's layers, encompasses all the previous layers. I don't know if this is true, so going off the OOP principles I've worked with for years, can anyone explain how functional programming does or doesn't accurately depict any of them: Encapsulation, Abstraction, Inheritance, Polymorphism. I think we can all say, yes, it has encapsulation via tuples - or do tuples technically count as a fact of "functional programming", or are they just a utility of the language? I know Haskell can meet the "interfaces" requirement, but again I'm not certain whether its method is a fact of functional programming. I'm guessing that since functors have a mathematical basis, you could say those are a definite built-in expectation of functional programming, perhaps? Please detail how you think functional programming does or does not fulfill the 4 principles of OOP.
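
    On the "interfaces" point, a hedged Haskell sketch of how a type class plays a role similar to an OOP interface, with instances providing the polymorphism (the Shape/area names are illustrative only):

        class Shape a where
          area :: a -> Double

        newtype Circle = Circle Double     -- radius
        newtype Square = Square Double     -- side length

        instance Shape Circle where
          area (Circle r) = pi * r * r

        instance Shape Square where
          area (Square s) = s * s

        -- area (Circle 1.0) and area (Square 2.0) dispatch on the type,
        -- much like calling a method through an interface.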

    Read the article
