Search Results

Search found 10417 results on 417 pages for 'large'.

Page 139/417 | < Previous Page | 135 136 137 138 139 140 141 142 143 144 145 146  | Next Page >

  • The Best How-To Geek Articles for June 2011

    - by Asian Angel
    June has been a busy month here at How-To Geek where we covered topics like cleaning keyboards, what to do when your e-mail has been compromised, creating high-resolution Windows 7 icons, and more. Join us as we look back at the most popular articles from this past month. Note: Articles are listed as #10 through #1. What You Said: How Do You Keep Notes? Note taking applications have grown increasingly sophisticated. Historically, when people took notes on a computer they simply used the word processor or text editor installed on it and left it at that—mostly because there were few widely available alternatives. While many people still use simple txt files for their note taking needs, an entire ecosystem of note taking apps exists now—thanks, in large part, to the rise of widespread internet access and easy synchronization. How To Encrypt Your Cloud-Based Drive with Boxcryptor; HTG Explains: Photography with Film-Based Cameras; How to Clean Your Dirty Smartphone (Without Breaking Something)

    Read the article

  • SQLIO Writes

    - by Grant Fritchey
    SQLIO is a fantastic utility for testing the abilities of the disks in your system. It has a very unfortunate name though, since it's not really a SQL Server testing utility at all. It really is a disk utility. They ought to call it DiskIO, because they'd get more people using it, I think. Anyway, branding is not the point of this blog post. Writes are the point of this blog post. SQLIO works by slamming your disk. It performs as many reads as it can or it performs as many writes as it can, depending on how you've configured your tests. There are much smarter people than me who will get into all the various types of tests you should run. I'd suggest reading a bit of what Jonathan Kehayias (blog|twitter) has to say or wading into Denny Cherry's (blog|twitter) work. They're going to do a better job than I can of describing all the benefits and mechanisms around using this excellent piece of software. My concerns are very focused. I needed to set up a series of tests to see how well our product SQL Storage Compress worked. I wanted to know the effects it would have on a system: the disk for sure, but also memory and CPU. How to stress the system? SQLIO of course. But when I set it up and ran it, following the documentation that comes with it, I was seeing better than 99% compression on the files. Don't get me wrong. Our product is magnificent, wonderful, all things great and beautiful, gets you coffee in the morning and is made mostly from bacon. But 99% compression? No, it's not that good. So what's up? Well, it's the configuration. The default mechanism is to load up a file, something large that will overwhelm your disk cache. You're instructed to load the file with the character 0x0. I never got a computer science degree. I went to film school. Because of this, I didn't memorize ASCII tables, so when I saw this, I thought it was zeros or something. Nope. It's NULL. That's right: you're making a very large file, but you're filling it with NULL values. That's actually OK when all you're testing is the disk subsystem. But when you want to test compression and decompression, that can be an issue. I got around this fairly quickly. Instead of generating a file filled with NULL values, I just copied a database file for my tests. And to test it with SQL Storage Compress, I used a database file that had already been run through compression (about 40% compression on that file, if you're interested). Now the reads were taken care of. I am seeing very realistic performance from decompressing the information for reads through SQLIO. But what about writes? Well, the issue is, what does SQLIO write? I don't have access to the code. But I do have access to the results. I did two different tests, just to be sure of what I was seeing. First test: use the .DAT file as described in the documentation. I opened the .DAT file after I was done with SQLIO, using WordPad. Guess what? It's a giant file full of air. SQLIO writes NULL values. What does that do to compression? I did the test again on a copy of an uncompressed database file. Then I ran the original and the SQLIO-modified copy through ZIP to see what happened. I got better than 99% compression out of the SQLIO-modified file (the original file of 624,896 KB went to 275,871 KB compressed; after SQLIO it went to 608 KB compressed). So, what does SQLIO write? It writes air. If you're trying to test it with compression, or maybe some other type of file storage mechanism like dedupe, you need to know this, because your tests really won't be valid.
Should I find some other mechanism for testing? Yeah, if all I'm interested in is establishing performance to my own satisfaction, yes. But, I want to be able to compare my results with other people's results and we all need to be using the same tool in order for that to happen. SQLIO is the common mechanism that most people I know use to establish disk performance behavior. It'd be better if we could get SQLIO to do writes in some other fashion. Oh, and before I go, I get to brag a bit. Measuring IOPS, SQL Storage Compress outperforms my disk alone by about 30%.
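
    A quick way to reproduce the compression observation above is to gzip the SQLIO .DAT file and a copy of a real database file and compare the ratios; a file full of NULL bytes compresses to almost nothing, while a genuine data file does not. The following is a minimal C# sketch of that check (the file paths are hypothetical, and GZip stands in for the ZIP tool used in the post):

        using System;
        using System.IO;
        using System.IO.Compression;

        class CompressibilityCheck
        {
            // Returns the compressed size of a file in bytes, compressing in a streaming fashion.
            static long CompressedSize(string path)
            {
                string tempPath = Path.GetTempFileName();
                try
                {
                    using (FileStream source = File.OpenRead(path))
                    using (FileStream target = File.Create(tempPath))
                    using (GZipStream gzip = new GZipStream(target, CompressionMode.Compress))
                    {
                        source.CopyTo(gzip);
                    }
                    return new FileInfo(tempPath).Length;
                }
                finally
                {
                    File.Delete(tempPath);
                }
            }

            static void Main()
            {
                // Hypothetical paths: the SQLIO test file and a copy of a real database file.
                foreach (string path in new[] { @"C:\SQLIO\testfile.dat", @"C:\SQLIO\copy_of_database.mdf" })
                {
                    long original = new FileInfo(path).Length;
                    long compressed = CompressedSize(path);
                    Console.WriteLine("{0}: {1:N0} KB -> {2:N0} KB ({3:P1} of original)",
                        path, original / 1024, compressed / 1024, (double)compressed / original);
                }
            }
        }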

    Read the article

  • How do I show a minimap in a 3D world?

    - by Bubblewrap
    Got a really typical use case here. I have a large map made up of hexagons and at any given time only a small section of the map is visible. To provide an overview of the complete map, I want to show a small 2D representation of the map in a corner of the screen. What is the recommended approach for this in libgdx? Keep in mind the minimap must be updated when the currently visible section changes and when the map is updated. I've found SpriteBatch (info here), but the warning label on it made me think twice: "A SpriteBatch is a pretty heavy object so you should only ever have one in your program." I'm not sure I'm supposed to use the one SpriteBatch that I can have on the minimap, and I'm also not sure how to interpret "heavy" in this context. Another thing to possibly keep in mind is that the minimap will probably be part of a larger UI; is there any way to integrate these two?

    Read the article

  • Webcor Builders Coordinates Construction Schedules and Mitigates Potential Delays More Efficiently with Integrated Project Management

    - by Sylvie MacKenzie, PMP
    With more than 40 years of commercial construction experience, Webcor Builders is a leading builder of distinguished, high-profile projects, including high-rise condominiums and hotels, laboratories, healthcare centers, and public works projects. Webcor is also known for its award-winning concrete, interior construction, historic restoration, and seismic renovation work. The company has completed more than 50 million square feet of projects to date. Considering the variety and complexity of the construction projects Webcor undertakes, an integrated project management solution is critical to ensuring optimal efficiency and completing client projects on time and on budget. The company previously used a number of scheduling systems for its various building projects. These packages provided different levels of schedule detail and required schedulers, engineers, and other employees to learn multiple systems. From an IT cost and complexity perspective, the company had to manage multiple scheduling systems and pay for multiple sets of licenses. The company looked to standardize on an enterprise project management system, and selected Oracle’s Primavera P6 Enterprise Project Portfolio Management. Webcor uses the solution’s advanced capabilities to schedule complex projects, analyze delays, model and propose multiple scenarios to demonstrate and mitigate delays and cost overruns, and process that information efficiently to deliver the scheduling precision that public and private projects require. In fact, the solution was instrumental in helping the company’s expansion into public sector projects during the recent economic downturn, and with Primavera P6 in place, it can deliver the precise schedule and milestone reporting required for large public projects. The solution is now in use managing the high-profile University of California – Berkeley Memorial Stadium project. Webcor was hired as construction manager and general contractor for the stadium renovation project, a fast-paced project located near the seismically active Hayward Fault Zone. Due to the University of California’s football schedule, meeting the university’s deadline for the coming season placed Webcor in a situation where risk awareness and early warnings of issues would be paramount. Webcor and the extended project team needed a solution that could instantly analyze alternate scenarios to mitigate potential delays; Primavera would deliver those answers. The team would also need to enable multiple stakeholders to use an internet-based platform to access the schedule from various locations, and model complicated sequencing requirements where swift decisions would be made to keep the project on track. The schedule is an integral part of Webcor’s construction management process for the stadium project.
    Rather than providing the client with the industry-standard monthly update, Webcor updates the critical path method (CPM) schedule on a weekly basis. The project team also reviews the schedule and updates it weekly to confirm that progress and forecasted performance are accurate. Hired by the university for its ability to deliver in high-risk environments, the Webcor team was recently hit with a design supplement that could have added up to 70 days to the project. Using Oracle Primavera P6, the team sprang into action, analyzing multiple “what if” scenarios to review mitigation means and methods. Determined to make sure the Bears could take the field in the coming season, the project team nearly eliminated the impact with their creative analysis in working the schedule. The total time from the issuance of the final design supplement to an agreed mitigation response was less than one week; by leveraging the Oracle Primavera solution, Webcor was able to deliver superior customer value. With the ability to efficiently manage projects and schedules, Webcor can ensure it completes its projects on time and on budget, as well as inform clients about what changes to plans will mean in terms of delays and additional costs. Read the complete customer case study at: http://www.oracle.com/us/corporate/customers/customersearch/webcor-builders-1-primavera-ss-1639886.html

    Read the article

  • Which JavaScript carousel zooms blocks from the playlist?

    - by Iain Hallam
    I saw a carousel/slider for displaying featured content a while ago that does something that most don't. It started fairly simply, with the top feature large and a playlist of the other featured stories to the side. Feature 1 then began to slide towards the bottom right, while feature 2 moved to occupy the main slot and the previews of features 3 and 4 moved up. The slider had now completed a whole swap, and was ready to do the same thing with feature 3. My Google-fu seems to be lacking in finding this again; does anyone know of this slider? I think it was based on one of the frameworks, but I'm not sure whether it was jQuery or one of the others.

    Read the article

  • Easiest, most fun way to program 2D games? Flash? XNA? Some other engine?

    - by Maxi
    Hi, this is a post detailing my search for the most enjoyable way for a hobbyist game programmer to sweeten his free time with making a game. My requirements: I looked at Flash first, and I made a couple of small games, but I'm doubtful of the performance. I would like to make a fairly large strategy game, with several hundred units fighting simultaneously, explosions and animations included. Also zoomable maps. I saw that Adobe has a new 3D API for Flash, but I don't know if that improves 2D performance as well; I couldn't find anything related to that question in their MAX10 sessions. Would you say that Flash is a good technology for making large 2D games easily? I really like ActionScript, and I love how easy everything is in Flash. There are several engines available which make it even easier. I just do this for fun, and it would be even better if there were proper animation/particle editors available and if the engine I were to use were available for multiple platforms (so more people can play my game once finished). I'd like to have it available on many mobile platforms as well (because I love touch input for some reason). I do know the XNA framework pretty well, but there are no good engines available for it, and it will only run on Windows, which is a huge turn-off. An even bigger one is that you need to install the XNA redistributable each time you want to give the game to someone. If I use XNA, I would have to make all the tools myself, and I'd probably have to make them with WPF. (I'd love to make tools with Adobe AIR, but unfortunately the APIs for image manipulation etc. are far worse in Flash than they are in XNA/WPF.) Now, I'm aware that I could make my own engine that supports each of those platforms, but quite frankly, that would be too much work plowing through APIs. After all, I want to make a game, not an engine. So the question becomes: is there maybe a cross-platform (free, or free to develop with?) engine available that I could use for 2D development? I prefer C# and ActionScript. I don't mind using C++ if the toolset is above average, but I highly doubt that there is something out there like that. Please prove me wrong :) So, summary: I'd like to use Flash, but I don't know if it scales well enough. I'm not a scripter; I want some real APIs that I can work with inside a proper IDE. Just for information, I looked at several alternatives; I've actually been looking for a long time already. You'd help me a lot to make a decision finally. Feature-wise the FlatRedBall engine would be ideal. But I tried their tools, and quite frankly, they are horrible. Absolutely unusable; I'd need to make my own for sure. I didn't look at their API, but if their tools are so bad, I'm not inclined to look further. Unity3D: this one is quite nice, but I really don't need 3D, and it is quite a lot of work to learn. I also don't like that it is so expensive to use for different platforms and that I can only code for it through scripting. You have to buy each platform separately. The editor usability is average; the product overall is good enough for most purposes, but learning it myself would be overkill. Shiva 3D: it looks good enough, but again, I don't really need 3D. The editor usability is a little worse than Unity3D in my opinion, and it wasn't clear to me how to start programming. I think it requires C++ for coding, so that's a negative too. I want to have fun, and C# is fun ;) SDL: quite frankly, I'd still need to port to all those different SDL implementations. And I don't like OpenGL-style programming; it's just plain ugly. And it needs C++. I know that there might be some wrappers available, but I don't like to use wrappers, because... Irrlicht: a lot of features, but support seems to be low and it is aimed at enthusiasts. C# bindings get dropped repeatedly. I'm not an engine enthusiast, I just want to make a game. I don't see this happening with Irrlicht. Ogre3D: way too much work; it's just a graphics engine. Also no multiple-platform support, and C++. Torque2D: costs something to use, and I didn't hear a lot of good things about support and documentation. Also costs extra for each platform.

    Read the article

  • Pros and cons of developing modern services in Java

    - by r3mus
    I'm interested in the philosophical and architectural justification (or lack thereof) for using Java to develop in today's modern world (excluding mobile/embedded platforms, of course). Why would one choose to develop (or not develop) a back-end in Java? Why would one choose to develop (or not develop) a front-end UI in Java? Why do large enterprises lean towards developing in Java rather than adopting more modern (and standardized) technologies? Disclaimer: I'm not a fan of Java in the enterprise; I'm simply curious what drives enterprises to continue the trend.

    Read the article

  • How can I monitor a website for malicious changes to the files

    - by user41421
    I had an occasion recently where our website was compromised: a link farm was added to a couple of the pages on one occasion, and on another occasion a large and nasty .aspx file was put on the server. I won't mention the host's name (Hostway), but I was pretty annoyed that someone was able to do this. No, it wasn't a leaky password; around 10 sites hosted by HW with consecutive IP addresses got trashed. Anyway, what I need is a utility or service (preferably free) that takes a snapshot of my website's contents, and then regularly monitors the files (size and datestamp) for unauthorized changes or additions and alerts me. I've used web services that monitor one file for changes, but I'm looking for something a bit more aggressive.
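
    One way to build such a check yourself, assuming you can run a scheduled task against the live files (or against an FTP mirror you pull down first), is to hash every file and compare the result against a saved snapshot; hashing catches content changes even when size and datestamp are preserved. A rough C# sketch follows; the web-root and snapshot paths are made up for illustration, and alerting (e.g. sending an e-mail) is left as a comment:

        using System;
        using System.Collections.Generic;
        using System.IO;
        using System.Linq;
        using System.Security.Cryptography;

        class SiteSnapshot
        {
            const string WebRoot = @"C:\inetpub\wwwroot";          // hypothetical site root (or local mirror)
            const string SnapshotFile = @"C:\snapshots\site.txt";  // hypothetical snapshot location

            static string HashFile(string path)
            {
                using (var sha = SHA256.Create())
                using (var stream = File.OpenRead(path))
                    return BitConverter.ToString(sha.ComputeHash(stream));
            }

            static void Main()
            {
                // Hash every file under the web root.
                Dictionary<string, string> current =
                    Directory.EnumerateFiles(WebRoot, "*", SearchOption.AllDirectories)
                             .ToDictionary(f => f, HashFile);

                if (File.Exists(SnapshotFile))
                {
                    var previous = File.ReadAllLines(SnapshotFile)
                                       .Select(l => l.Split('\t'))
                                       .ToDictionary(p => p[0], p => p[1]);

                    // New or modified files are exactly what a link farm or dropped .aspx shows up as.
                    foreach (var kv in current)
                    {
                        string oldHash;
                        if (!previous.TryGetValue(kv.Key, out oldHash))
                            Console.WriteLine("NEW FILE: " + kv.Key);   // send an alert e-mail here
                        else if (oldHash != kv.Value)
                            Console.WriteLine("CHANGED:  " + kv.Key);   // send an alert e-mail here
                    }
                }

                // Save the current state for the next scheduled run.
                File.WriteAllLines(SnapshotFile, current.Select(kv => kv.Key + "\t" + kv.Value));
            }
        }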

    Read the article

  • Object-Oriented OpenGL

    - by Sullivan
    I have been using OpenGL for a while and have read a large number of tutorials. Aside from the fact that a lot of them still use the fixed pipeline, they usually throw all the initialisation, state changes and drawing in one source file. This is fine for the limited scope of a tutorial, but I’m having a hard time working out how to scale it up to a full game. How do you split your usage of OpenGL across files? Conceptually, I can see the benefits of having, say, a rendering class that purely renders stuff to screen, but how would stuff like shaders and lights work? Should I have separate classes for things like lights and shaders?
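
    One common answer, sketched below in C# with the actual GL calls stubbed out as comments (so it applies to whatever binding you use; OpenTK is one option), is to give each GPU resource a small wrapper class that owns its handle, keep lights as plain data, and let a single renderer consume both. This is only an organizational sketch under those assumptions, not a complete engine:

        // Organizational sketch: each class owns one responsibility.
        // Replace the commented pseudo-GL calls with your binding's equivalents.

        public sealed class Shader : System.IDisposable
        {
            public int ProgramId { get; private set; }

            public Shader(string vertexSource, string fragmentSource)
            {
                // Compile and link the two sources here (glCreateShader, glCompileShader, glLinkProgram, ...)
                // and store the resulting handle in ProgramId.
            }

            public void Use()                                  { /* glUseProgram(ProgramId); */ }
            public void SetUniform(string name, float value)   { /* glUniform1f(location(name), value); */ }
            public void Dispose()                              { /* glDeleteProgram(ProgramId); */ }
        }

        public struct Light
        {
            // Lights are just data; the shader decides what to do with them.
            public float X, Y, Z;
            public float R, G, B;
        }

        public interface IRenderable
        {
            void Draw(Shader shader);   // issue the draw calls for this object's mesh
        }

        public sealed class Renderer
        {
            // The renderer is the only class that owns the frame: clear, bind, draw, present.
            public void RenderFrame(System.Collections.Generic.IEnumerable<IRenderable> scene,
                                    Shader shader,
                                    System.Collections.Generic.IReadOnlyList<Light> lights)
            {
                // glClear(...);
                shader.Use();
                for (int i = 0; i < lights.Count; i++)
                {
                    // Upload light i as uniforms, e.g. shader.SetUniform("lights[" + i + "].intensity", ...);
                }
                foreach (var item in scene)
                    item.Draw(shader);
                // SwapBuffers();
            }
        }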

    Read the article

  • Document Link about Database Features on Exadata

    - by Bandari Huang
    DBFS on Exadata
      - Exadata MAA Best Practices Series - Using DBFS on Exadata (Internal Only)
      - Oracle® Database SecureFiles and Large Objects Developer's Guide 11g Release 2 (11.2) E18294-01
      - Configuring a Database for DBFS on Oracle Database Machine [ID 1191144.1]
      - Configuring DBFS on Oracle Database Machine [ID 1054431.1]
      - Oracle Sun Database Machine Setup/Configuration Best Practices [ID 1274318.1] - Verify DBFS Instance Database Initialization Parameters
    DBRM on Exadata
      - Exadata MAA Best Practices Series - Benefits and use cases with Resource Manager, Instance Caging, IORM (Internal Only)
      - Oracle® Database Administrator's Guide 11g Release 2 (11.2) E25494-02

    Read the article

  • asp.net website development component / APIs

    - by Haseeb Asif
    I have been assigned a new website project to work on in my organization, where my role demands that I finalize all the tools/technologies/controls/APIs etc. The website will be something like an online store, where every user has his online store as a subdomain, e.g. user1.myprojectdomain.com. I have been researching a number of things to use and need your suggestions on the following points. ASP.NET Web Forms vs ASP.NET MVC: preferring ASP.NET Web Forms with an N-tier architecture, due to rapid application development, the large set of toolbox controls, and mainly due to our team's skill set. Error logging: ELMAH seems to be a nice library. Forums: Forums, Yetanotherforums. Online live chat: still looking for something (working on SignalR). Sign-ups with social media: Engage by Janrain. And I need help with how to manage the subdomains. Do we create a virtual directory/application for every user in IIS at runtime, or can we do something else (see the sketch below)?
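
    Regarding the subdomain question: a common alternative to creating a virtual directory per user, assuming you control DNS, is to point a wildcard DNS record (*.myprojectdomain.com) and a single IIS binding at one application, and then resolve the store from the host header on each request. Below is a hedged C# sketch of that idea; the helper name, base domain, and store lookup are hypothetical, not part of any particular library:

        using System;
        using System.Web;

        // Hypothetical helper: maps an incoming host header such as "user1.myprojectdomain.com"
        // to that user's store, so one IIS application can serve every subdomain.
        public static class StoreResolver
        {
            const string BaseDomain = "myprojectdomain.com"; // assumption: wildcard DNS + wildcard binding point here

            public static string GetStoreName(HttpRequest request)
            {
                string host = request.Url.Host;                       // e.g. "user1.myprojectdomain.com"
                if (!host.EndsWith("." + BaseDomain, StringComparison.OrdinalIgnoreCase))
                    return null;                                      // root domain or unrelated host

                string subdomain = host.Substring(0, host.Length - BaseDomain.Length - 1);
                return subdomain.Equals("www", StringComparison.OrdinalIgnoreCase) ? null : subdomain;
            }
        }

        // Usage from a page or handler (hypothetical store lookup):
        //   string store = StoreResolver.GetStoreName(HttpContext.Current.Request);
        //   if (store != null) { /* load that user's products from the database */ }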

    Read the article

  • Translatability Guidelines for Usability Professionals

    - by ultan o'broin
    There is clearly a demand for translatability guidelines aimed at usability professionals working in the enterprise applications space, judging by Google Analytics and the interest generated in the Twitterverse by my previous post on the subject. So let's continue the conversation. I'll flesh out each of the original points a bit more in posts over the coming weeks. Bear in mind that large-scale enterprise translation is a process. It needs to be scalable, repeatable, and maintainable, and above all meet the requirements of automation. That doesn't mean the user experience needs to suffer, however. So, stay tuned for some translatability best practices for usability professionals....

    Read the article

  • Using a back-end mechanism to copy files to DB and notify the application

    - by BDotA
    The scenario: a user copies large files to a local folder. I want to watch that folder and, when a new file is dropped, go and copy it to the database, so that when copying is done I can actually use it in my application (a C# WinForms app). It would be awesome to also find a way to somehow get notified in the application that copying the file to the DB is finished and the file is ready for use... I am using C#.NET on Windows. What solutions/architecture do you suggest for this? For example, having a Windows service running all the time watching that folder, which, when something is copied, goes and writes it to the DB... then how about getting notified? Is MSMQ something I can use? I don't know much about it yet. Thanks.
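
    One way to structure that, as a rough sketch rather than a finished design: a Windows service (or a background component inside the WinForms app) uses FileSystemWatcher on the drop folder, inserts each new file into a varbinary(max) column, and then signals completion, via an in-process event if everything lives in one process, or via an MSMQ message or a flag column the app polls if the watcher runs as a separate service. The connection string and Files table below are invented for illustration:

        using System;
        using System.Data.SqlClient;
        using System.IO;
        using System.Threading;

        public class FolderToDatabaseCopier
        {
            // Hypothetical connection string and table: Files(Name nvarchar, Content varbinary(max), LoadedAt datetime)
            const string ConnectionString = @"Server=.;Database=FileStore;Integrated Security=true";

            FileSystemWatcher watcher;                 // kept as a field so it stays alive
            public event Action<string> FileStored;    // raised when a file is safely in the database

            public void Watch(string folder)
            {
                watcher = new FileSystemWatcher(folder);
                watcher.Created += (s, e) => CopyToDatabase(e.FullPath);
                watcher.EnableRaisingEvents = true;
            }

            void CopyToDatabase(string path)
            {
                WaitUntilUnlocked(path);   // large files may still be copying when Created fires

                // For truly huge files, stream in chunks instead of ReadAllBytes.
                byte[] content = File.ReadAllBytes(path);
                using (var connection = new SqlConnection(ConnectionString))
                using (var command = new SqlCommand(
                    "INSERT INTO Files (Name, Content, LoadedAt) VALUES (@name, @content, GETDATE())", connection))
                {
                    command.Parameters.AddWithValue("@name", Path.GetFileName(path));
                    command.Parameters.AddWithValue("@content", content);
                    connection.Open();
                    command.ExecuteNonQuery();
                }

                var handler = FileStored;  // in-process notification; swap for an MSMQ send if the watcher is a service
                if (handler != null) handler(path);
            }

            static void WaitUntilUnlocked(string path)
            {
                while (true)
                {
                    try { using (File.Open(path, FileMode.Open, FileAccess.Read, FileShare.None)) return; }
                    catch (IOException) { Thread.Sleep(500); }
                }
            }
        }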

    Read the article

  • LesJeudis.com Paris job fair: more than 2,000 IT positions to fill this Thursday, 27 September; Rennes and Lyon to follow

    LesJeudis.com Paris job fair: more than 2,000 IT positions to fill this week, on Thursday 27 September; Rennes and Lyon to follow. LesJeudis.com is organizing a new recruitment fair this Thursday, 27 September, at the CNIT at La Défense from 11 a.m. to 9 p.m. For developers looking for a job or a new position, more than 2,000 openings will be on offer. Among the many companies that will be looking for candidates are the usual Sogeti and Steria, but also, across a fairly broad range, companies such as Parrot, Michelin, OVH and the PMU. Each candidate will also have the opportunity to submit his or her profile to a human resources consultant...

    Read the article

  • How do I word my url so that it doesn't get blocked or appear spammy

    - by user18681
    I'm creating a fairly large site. Will my links appear spammy if I use the same word in the URL slug as in the path? For example:
    www.example.com/apples/great-apple-recipes
    www.example.com/apples/fresh-apple-pie
    www.example.com/apples/delicious-apple-turnovers
    I do not want my links to appear spammy. But is it OK if the keyword is almost always the same as in the path on a huge site? Does the path count as part of the keyword? Also, how many words in total should a URL (including the path etc.) be?

    Read the article

  • Can Win32 message loops survive being ported to native linux?

    - by Chris Cochran
    I would like to port a large Win32 DLL to native Linux in C++. I don't think I can use Wine for a DLL like mine, because users of the DLL would then also have to be in Wine, and then they would all whine... As a Windows C++ programmer, I don't (yet) have any familiarity with the GUI front-end services in Linux, but if it logically runs on anything like Win32 message loops, fonts, bitmaps, invalidation regions, GetMessage() calls and so forth, it should be a fairly straightforward remapping of my existing code. So what am I looking at here, a remap or a rewrite? The path for such things must be well worn by now.

    Read the article

  • Library and several small programs that use it: how should I structure my git repository?

    - by Dan
    I have some code that uses a library that I and others frequently modify (usually only by adding functions and methods). We each keep a local fork of the library for our own use. I also have a lot of small "driver" programs (~100 lines) that use the library and are used exclusively by me. Currently, I have both the driver programs and the library in the same repository, because I frequently make changes to both that are logically connected (adding a function to the library and then calling it). I'd like to merge my fork of the library with my co-workers' forks, but I don't want the driver programs to be part of the merged library. What's the best way to organize the git repositories for a large, shared library that needs to be merged frequently and a number of small programs that have changes that are connected to changes in the library?

    Read the article

  • memory map huge file with boost

    - by HaveF
    I want to handle huge files (TB). After several searches, I found that Boost could help: boost/interprocess/file_mapping.hpp, and I also found the demo code. Because the file that I read is too large (TB), I think I should create a fixed-size mapping (say 1 GB) and remap it when the data isn't in the currently mapped window. But I don't know how to write this part. I only found another web page, which uses Boost.Iostreams to handle this problem. Should I use Boost.Iostreams, or boost.interprocess.file_mapping? (If the latter, please show me some code.) Thanks!

    Read the article

  • Hello NHibernate! Quickstart with NHibernate (Part 1)

    - by BobPalmer
    When I first learned NHibernate, I could best describe the experience as less of a learning curve and more like a learning cliff.  A large part of that was the availability of tutorials.  In this first of a series of articles, I will be taking a crack at providing people new to NHibernate the information they need to quickly ramp up with NHibernate. For the first article, I've decided to address the gap of just giving folks enough code to get started.  No UI, no fluff - just enough to connect to a database and do some basic CRUD operations.  In future articles, I will discuss a repository pattern for NHibernate, parent-child relationships, and other more advanced topics. You can find the entire article via this Google Docs link: http://docs.google.com/Doc?docid=0AUP-rKyyUMKhZGczejdxeHZfOGMydHNqdGc0&hl=en Enjoy! -Bob
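
    To give a flavor of the "just enough code" the article is aiming for, here is a minimal hedged sketch of the usual NHibernate bootstrap and CRUD cycle; it assumes a hibernate.cfg.xml and a mapping for a hypothetical Product class are already in place (the article itself walks through those pieces):

        using System;
        using NHibernate;
        using NHibernate.Cfg;

        public class Product
        {
            public virtual Guid Id { get; set; }       // NHibernate proxies require virtual members
            public virtual string Name { get; set; }
            public virtual decimal Price { get; set; }
        }

        class Program
        {
            static void Main()
            {
                // Reads hibernate.cfg.xml (connection string, dialect, mapping assembly) from the app folder.
                ISessionFactory sessionFactory = new Configuration().Configure().BuildSessionFactory();

                object id;
                using (ISession session = sessionFactory.OpenSession())
                using (ITransaction tx = session.BeginTransaction())
                {
                    // Create
                    id = session.Save(new Product { Name = "Widget", Price = 9.99m });
                    tx.Commit();
                }

                using (ISession session = sessionFactory.OpenSession())
                using (ITransaction tx = session.BeginTransaction())
                {
                    // Read and update: changes to a loaded entity are flushed automatically on commit.
                    var product = session.Get<Product>(id);
                    product.Price = 12.50m;
                    tx.Commit();
                }

                using (ISession session = sessionFactory.OpenSession())
                using (ITransaction tx = session.BeginTransaction())
                {
                    // Delete
                    session.Delete(session.Get<Product>(id));
                    tx.Commit();
                }
            }
        }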

    Read the article

  • Will Tracking Subdomains as Single Entity with Google Analytics Help SEO? [closed]

    - by Sam Gridley
    Possible Duplicate: Does Google Analytics data affect SEO? We have two subdomains, one for our blog and one for our ecommerce store. The blog serves to bring traffic and the store is how we monetize the site. We have them designed to appear as one large site, but I know Google sees them as two sites. Here is how the subdomains look: www.example.com (store), blog.example.com (blog). I believe I can configure Analytics to use subdomain tracking as explained here: http://support.google.com/googleanalytics/bin/answer.py?hl=en&answer=55524 But my question is whether this will cause Google to see our two subdomains as one larger domain for SEO purposes. In other words, is there any relationship between how you configure Google Analytics and how Google indexes and ranks your website(s) and pages? Is there anything I need to do in Analytics or Webmaster Tools to make Google aware that these two subdomains work together as one website? Thanks! Sam

    Read the article

  • Decline in Intel Atom processor sales: is the golden age of netbooks coming to an end?

    Update of 27.04.2010 by Katleen. Decline in Intel Atom processor sales: is the golden age of netbooks coming to an end? Market analysts at IDC believe the netbook phenomenon has reached its peak. Figures from Intel appear to confirm this hypothesis. Indeed, sales of Intel Atom processors for mobile devices are in decline. This drop reverses the trend of recent months, during which the chip accounted for a large percentage of mobile processor shipments. Intel ships the majority of its Atom processors to netbook manufacturers, the makers of those low-priced miniature laptops.

    Read the article

  • What should a game have in order to keep humans playing it?

    - by Adam Davis
    In many entertainment professions there are suggestions, loose rules, or general frameworks one follows that appeal to humans in one way or another. For instance, many movies and books follow the monomyth. In video games I find many types of games that attract people in different ways. Some people are addicted to Facebook gem-matching games. Others can't get enough of FPS games. Once in a while, though, you find a game that seems to transcend stereotypes and appeals almost immediately to everyone who plays it. For instance, Plants Versus Zombies seems to have a very, very large demographic of players. There are other games with a similar reach. I'm curious what books, blogs, etc. there are that explore these game types and styles, and try to suss out one or more popular frameworks/styles that satisfy people while keeping them coming back for more.

    Read the article

  • What measures can be taken to increase Google indexing speed for a given newly created page?

    - by knorv
    Consider a website with a large number of pages. New pages are published regularly. When publishing a new page the website operator wants to get the newly created page indexed in Google as soon as possible. The website operator wants to minimize the time spent between publication and indexing. Consider the site http://www.example.com/ with hundreds of thousands of pages. The page http://www.example.com/something/important-page.html is created at, say, 12:00. How do I get important-page.html indexed as soon as possible after 12:00? Ideally within seconds or minutes. Or more generally: what options are available to try to get Google to index a specific newly created page as soon as possible?
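
    One concrete lever, sketched below in C# under the assumption that the site already regenerates an XML sitemap on publish, is to ping Google's sitemap endpoint the moment the new page is added to the sitemap; combined with internal links from already-indexed pages, this tends to shorten the publication-to-indexing gap. The ping URL shown was Google's documented mechanism at the time; treat the exact endpoint and the sitemap location as assumptions:

        using System;
        using System.Net;

        class SitemapPing
        {
            static void Main()
            {
                // Assumption: http://www.example.com/sitemap.xml is regenerated whenever a page is published
                // and already lists important-page.html with a fresh <lastmod> value.
                string sitemapUrl = "http://www.example.com/sitemap.xml";
                string pingUrl = "http://www.google.com/ping?sitemap=" + Uri.EscapeDataString(sitemapUrl);

                using (var client = new WebClient())
                {
                    client.DownloadString(pingUrl);   // tells Google the sitemap has changed
                }
                Console.WriteLine("Pinged Google for " + sitemapUrl);
            }
        }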

    Read the article

  • Which hidden files and directories do I need?

    - by Sammy Black
    In a previous question, I explained my situation/plan: backing up home directory on external drive, reformatting laptop drive, installing 14.04, putting home directory back. (It hasn't happened yet because I can't seem to find the down time, in case things aren't working right away.) It occurred to me that maybe I don't want all of those hidden files and directories (e.g. .local/share/ubuntuone/syncdaemon/, .cache/google-chrome/, etc.) Just judging by the amount of time in copying, I can tell that some of these hidden directories are large. Question: Are there any hidden directories that I obviously don't need/want when I have the laptop running an updated distribution? Will they cause conflicts? (I plan on copying the backed-up directory tree back onto the laptop with the --no-clobber option.)

    Read the article

  • Methods of ordering function definitions in code

    - by xralf
    When I work on a programming project (usually a command-line application in Python with many switches), I usually create 30 or more functions. Most of the functions are in one file (except for some helpers that I use in other projects). Some of the functions are called for a particular switch (like -p or --print), but many functions do helper computations, print operations, or database operations, because I don't want the main functions to be too large. When I have an idea for new functionality, I often put the new functions at a random place in the file. Should I think more about it and place them somewhere particular? Are there methods for this?

    Read the article
