Search Results

Search found 30894 results on 1236 pages for 'best practice'.

  • Sizing a Virtual Server

    - by vdubs
    I would like to replace four aging physical servers with one virtualization host. What is the best way to ensure the host is sized correctly? The requirements of the apps that will be running on the four servers are:

    Application servers (qty 3) - These will run the application layer for the web server, the Business Objects Business Intelligence app, and various other small client-server apps. The three heaviest-hitting apps each have the following server requirements, so if I bought three physical servers, each would need: Processor - dual 2.83 GHz (or faster); RAM - 4 GB; RAID 5 - 50-100 GB usable space; NIC - 1 Gb.

    Web server - This will run one ASP.NET e-business app that will talk to our dedicated SQL server and the three app servers above. The e-business software has these requirements for the web server: Processor - quad 2.83 GHz (or faster); RAM - 8 GB; RAID 5 - 50-100 GB usable space; NIC - 1 Gb.

    What is the best tool to determine what I need from a hardware standpoint in a virtual server? I am planning on using VMware.
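    A first cut is simple arithmetic: total the stated requirements and add headroom for the hypervisor and growth. The sketch below does exactly that; the 25% headroom and 2:1 vCPU-to-core overcommit figures are illustrative assumptions, not VMware guidance, and should be replaced with measured utilization data (tools such as VMware Capacity Planner existed to gather exactly those measurements).

        # Back-of-the-envelope host sizing from the stated per-server specs.
        # Headroom and overcommit ratios are assumptions, not VMware guidance.
        workloads = [
            {"name": "app1", "cores": 2, "ram_gb": 4, "disk_gb": 100},
            {"name": "app2", "cores": 2, "ram_gb": 4, "disk_gb": 100},
            {"name": "app3", "cores": 2, "ram_gb": 4, "disk_gb": 100},
            {"name": "web",  "cores": 4, "ram_gb": 8, "disk_gb": 100},
        ]
        headroom = 1.25        # 25% spare for hypervisor overhead and growth
        cpu_overcommit = 2.0   # 2:1 vCPU-to-core ratio; conservative for light apps

        vcpus = sum(w["cores"] for w in workloads)       # 10
        ram_gb = sum(w["ram_gb"] for w in workloads)     # 20
        disk_gb = sum(w["disk_gb"] for w in workloads)   # 400
        print(f"physical cores >= {vcpus * headroom / cpu_overcommit:.1f}")  # 6.2
        print(f"RAM (GB)       >= {ram_gb * headroom:.0f}")                  # 25
        print(f"disk (GB)      >= {disk_gb * headroom:.0f}")                 # 500

    The point of the sketch is that peak CPU on the four boxes rarely coincides, which is what makes consolidation onto roughly 8 cores, 32 GB of RAM, and 500 GB of usable RAID-10 storage plausible; measured data should confirm that before buying.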

    Read the article

  • Database Insider Newsletter Helps Oracle Achieve Maggie Award Bid

    - by jenny.gelhausen
    The Database Insider team is honored that our monthly newsletter has helped Oracle become a 2010 Maggie Award finalist. The Maggie Awards, known as the "Oscars" of the periodicals industry, recognize excellence among individuals and companies whose work is deemed "The Best in the West" across a wide variety of publishing categories. The list of 2010 Maggie Award finalists is impressive and includes some past champions - so win or lose, the Database Insider team is thrilled to have helped Oracle reach the finals in the category of Best Web E-Newsletter/Trade & Consumer. Thanks to all our faithful readers and subscribers. Haven't seen our newsletter yet? Read the latest Database Insider Newsletter edition. We invite you to subscribe and join the others receiving the Oracle Database Insider Newsletter in their inbox: click here to register to start receiving your monthly Database Insider newsletter. Under Oracle Communications, check the box next to: Oracle Database Insider - All about Oracle Database features and options including news and analysis, reviews, customer stories, events, offers, and more. Monthly. See sample.

    Read the article

  • DevDays ‘10 The Netherlands day #2

    - by erwin21
    Day 2 of DevDays 2010, and again 5 interesting sessions at the World Forum in The Hague. The first session of the day, in the big World Forum theater, was from Scott Hanselman, who gave a lap around .NET 4.0. In his usual style of presenting he talked about all kinds of new features of .NET 4.0, like MEF, threading, parallel processing, changes and additions to the CLR and DLR, WPF, and all the new language features of .NET 4.0. After a small break it was time for session 2, from Scott Allen, about tips, tricks, and optimizations for LINQ. He talked about lazy and deferred execution, the difference between IQueryable and IEnumerable, and the two flavors of LINQ syntax. The lunch was again well prepared and delicious, but after that it was time for session 3, Web Vulnerabilities and Exploits, from Alex Thissen. This was no normal session but more like a workshop: we decided what kinds of subjects we discussed, and the subjects were OWASP, XSS and other injections, validation, and encoding. He gave some handy tips and tricks on how to prevent such attacks. Session 4 was about the new features of C# 4.0, from Alex van Beek. He talked about optional and named parameters, generic co- and contravariance, the dynamic keyword, and COM interop features. He showed how to use them but also when not to use them. The last session of the day, and also the last session of DevDays 2010, was about WCF best practices, from Gerben van Loon. He talked about 7 best practices that you must know when you are going to use WCF. With some quick demos he showed the problem and the solution for some common issues. They were two interesting days, and next year I will surely be attending again.
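    The deferred-execution point from session 2 is worth a concrete illustration. LINQ queries don't run when you define them but when you enumerate them; Python generators behave the same way, so here is the idea in a runnable, language-neutral sketch (an analogy to the C# behavior, not C# itself):

        # Deferred execution: the "query" does no work until it is consumed.
        log = []

        def expensive(x):
            log.append(x)      # records when each element is actually processed
            return x * x

        data = [1, 2, 3, 4]
        query = (expensive(x) for x in data if x % 2 == 0)   # nothing runs yet
        print(log)             # [] -- defining the query did no work
        results = list(query)  # enumeration triggers execution
        print(log, results)    # [2, 4] [4, 16]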

    Read the article

  • Connecting Small business network to Azure Site to Site VPN

    - by MarkKGreenway
    I would like to have connectivity between Azure virtual machines and on-LAN users. My current network has a Cisco ISA550 connected to the WAN (one Ethernet cable into the office; the fiber transceiver is on a different floor), and any public servers can be one-to-one NATed to have a public and a private IP. What is the best way to get a reliable connection between end users and the cloud servers? I want to know the preferred on-site endpoint. Do the Azure VMs have to have a local IP in the LAN subnet? (Right now 10.10.0.0/20, or 255.255.240.0, to give room if this is the case.) If I purchased an ASA550, would I put it behind or in front of the ISA550? Would it sit ahead of, or peer with, the users' switches?

    Read the article

  • I/O intensive MySql server on Amazon AWS

    - by rhossi
    We recently moved from a traditional data center to cloud computing on AWS. We are developing a product in partnership with another company, and we need to create a database server for the product we'll release. I have been using Amazon Web Services for the past 3 years, but this is the first time I have received a spec with such a specific hardware configuration. I know there are trade-offs and that real hardware will always be faster than virtual machines; knowing that beforehand, what would you recommend? 1) Amazon EC2? 2) Amazon RDS? 3) Something else? 4) Forget it baby, stick to the real hardware. Here are the hardware requirements. Two servers are specified, with identical configurations; each will be focused on I/O and MySQL for the statistics, with the memory size and disk space needed for the image hosting.

    I/O - The main workload on each server will be I/O processing. FusionIO cards have proven themselves extremely efficient; this is currently the best you can have in this domain: Fusion ioDrive2 MLC 365GB (http://www.fusionio.com/load/-media-/1m66wu/docsLibrary/FIO_ioDrive2_Datasheet.pdf)

    CPU - MySQL will use fewer CPU cores than Apache, but it will use them very hard; the E7 family has 30 MB of L3 cache, which boosts performance: 1x Intel E7-2870 will be OK.

    Storage - SAS will be good enough in terms of performance, especially considering the space required: RAID 10 of 4 x SAS 10k or 15k drives, for a total available space of 512 GB.

    Memory - 64 GB minimum is required on each server, considering the size of the statistics database. Warning: the statistics database will grow quickly; if possible consider starting with 128 GB directly, it will help.

    Thanks in advance. Best,

    Read the article

  • Practical RAID Performance?

    - by wag2639
    I've always thought the following to be general rules of thumb for RAID:

    RAID 0: best performance for READ and WRITE, from striping; greatest risk.
    RAID 1: redundant; decent for READ (I believe it can read different parts of a file from different hard drives); not the best for WRITE.
    RAID 0+1 (01): combines the redundancy of RAID 1 with the performance of RAID 0.
    RAID 1+0 (10): slightly better version of RAID 0+1.
    RAID 5: good READ performance, bad WRITE performance; redundant.

    Is this assumption correct? (And how do they compare to a JBOD setup for read/write I/O performance?) Are certain practical RAID setups better for different applications: gaming, video editing, databases (Access or SQL)? I was thinking about hard disk drives, but does this apply to solid state drives as well?
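    The WRITE half of these rules can be made concrete with the classic write-penalty arithmetic: every small random write costs 1 disk I/O on RAID 0, 2 on a mirror (RAID 1/10), 4 on RAID 5 (read data, read parity, write data, write parity), and 6 on RAID 6. A sketch, assuming four disks at a nominal 150 IOPS each (illustrative numbers, not vendor specs):

        # Effective random-write IOPS = raw IOPS of all disks / write penalty.
        # Reads scale close to the raw figure on all of these levels.
        DISKS = 4
        IOPS_PER_DISK = 150   # assumed nominal figure for a 10k-rpm drive
        WRITE_PENALTY = {"RAID0": 1, "RAID1": 2, "RAID10": 2, "RAID5": 4, "RAID6": 6}

        raw = DISKS * IOPS_PER_DISK   # 600 IOPS raw
        for level, penalty in WRITE_PENALTY.items():
            print(f"{level:6s} ~{raw // penalty:3d} random-write IOPS")
        # RAID0 ~600, RAID1/RAID10 ~300, RAID5 ~150, RAID6 ~100

    So the rule of thumb holds: RAID 5 reads fine but pays four I/Os per small write, which is exactly why it fares badly for write-heavy databases. SSDs change the absolute numbers dramatically but not the penalty ratios.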

    Read the article

  • Announcing the MOS WCI "Community"

    - by brian.harrison
    The WCI Technical Support team is pleased to announce the launch of the long-awaited WCI Support Community on My Oracle Support (MOS) "Community". Users can navigate to this "first stop" for WebCenter Interaction information by logging on and following this link: WCI Community (note that this requires a valid login credential for the My Oracle Support tool). In this community you'll find a product-related discussion forum moderated by Oracle WebCenter Interaction support engineers, recommended tips and tricks, links to knowledge base articles, and best practices for setting up and administering your environment. We hope you'll take a minute to have a look through the community. If you have a question about WebCenter Interaction, or a comment or a suggestion regarding the content, please feel free to post it to the forum and someone will respond to your request. Think of the forum as another method to communicate directly with the WCI Technical Support team for questions and answers on simple WCI support topics. The forum is moderated by WCI Technical Support engineers directly, and we hope it will help you avoid the need to log support incidents for less complex support-related questions. We encourage all of our customers, both internal and external, to participate in the forum's discussions, sharing information, knowledge, and best practices, in an effort to help us build a vital and vibrant "home base" for WCI users on the My Oracle Support tool. Thank you for visiting! The WebCenter Interaction Support Community Moderator Team

    Read the article

  • Multidimensional Thinking–24 Hours of Pass: Celebrating Women in Technology

    - by smisner
    It’s Day 1 of #24HOP and it’s been great to participate in this event with so many women from all over the world in one long training-fest. The SQL community has been abuzz on Twitter with running commentary, which is fun to watch while listening to the current speaker. If you missed the fun today because you’re busy with all that work you’ve got to do – don’t despair. All sessions are recorded and will be available soon. Keep an eye on the 24 Hours of Pass page for details. And the fun’s not over today. Rather than run 24 hours consecutively, #24HOP is now broken into 12 hours on each of two days, so check out the schedule to see if there’s a session that interests you and fits your schedule. I’m pleased to announce that my business colleague Erika Bakse (Blog | Twitter) will be presenting on Day 2 – her debut presentation for a PASS event. (And I’m also pleased to say she’s my daughter!) Multidimensional Thinking: The Presentation My contribution to this lineup of terrific speakers was Multidimensional Thinking. Here’s the abstract: “Whether you’re developing Analysis Services cubes or creating PowerPivot workbooks, you need to get into a multidimensional frame of mind to produce a model that best enables users to answer their business questions on their own. Many database professionals struggle initially with multidimensional models because the data modeling process is much different from the one they use to produce traditional, third normal form databases. In this session, I’ll introduce you to the terminology of multidimensional modeling and step through the process of translating business requirements into a viable model.” If you watched the presentation and want a copy of the slides, you can download a copy here. And you’re welcome to download the slides even if you didn’t watch the presentation, but they’ll make more sense if you did! Kimball All the Way There’s only so much I can cover in the time allotted, but I hope that I succeeded in my attempt to build a foundation that prepares you for starting out in business intelligence. One of my favorite resources that gets into much more detail about all kinds of scenarios (well beyond the basics!) is The Data Warehouse Toolkit (Second Edition) by Ralph Kimball. Anything from Kimball or the Kimball Group is worth reading. Kimball material might take reading and re-reading a few times before it makes sense. From my own experience, I found that I actually had to just build my first data warehouse using dimensional modeling on faith that I was going in the right direction, because it just didn’t click with me initially. I’ve had years of practice since then, and I can say it does get easier with practice. The most important thing, in my opinion, is that you simply must prototype a lot and solicit user feedback, because ultimately the model needs to make sense to them. They will definitely make sure you get it right! Schema Generation One question came up after the presentation about whether we use SQL Server Management Studio or Business Intelligence Development Studio (BIDS) to build the tables for the dimensional model. My answer? It really doesn’t matter how you create the tables. Use whatever method you’re comfortable with. But it just so happens that it IS possible to set up your design in BIDS as part of an Analysis Services project and to have BIDS generate the relational schema for you. I did a Webcast last year called Building a Data Mart with Integration Services that demonstrated how to do this.
    Yes, the subject was Integration Services, but as part of that presentation, I showed how to leverage Analysis Services to build the tables, and then I showed how to use Integration Services to load those tables. I blogged about this presentation in September 2010 and included downloads of the project that I used. In the blog post, I explained that I missed a step in the demonstration. Oops. Just as an FYI, there were two more Webcasts to finish the story begun with the data mart – Accelerating Answers with Analysis Services and Delivering Information with Reporting Services. If you want to just cut to the chase and learn how to use Analysis Services to build the tables, you can see the Using the Schema Generation Wizard topic in Books Online.

    Read the article

  • Is Joel Test really a good gauging tool?

    - by henry
    I just learned about the Joel Test. I have been a computer programmer for 22 years, but somehow never heard about it before. I consider my best job so far to be at a small investment-managing company with 30 employees and only 3 people in the IT department. I am no longer with them, but I worked there for 5 years - my longest streak with any given company. To my surprise, they score extremely poorly on the Joel Test. The only two questions I would answer "yes" to are #4: Do you have a bug database? and #9: Do you use the best tools money can buy? Everything else is either "sometimes" or a straight "no". Here is what I liked about the company, however: a) Good pay; they bragged about it to my face and I bragged about it to their face, so it was almost like a family environment. b) I always knew the big picture. When writing code to solve a particular problem, there was no ambiguity about the business nature of that problem. Even though we did not always have written specs, we could ask the business users a question anytime, often yelling it across the floor. I could even talk to executives any time I felt like it: no appointment necessary. c) Immediate feedback. Once we implemented a solution and made the business users happy, they immediately let us know, and we (programmers) became the heroes of the moment. d) No red tape. I could always buy any tools I deemed necessary, and design solutions the way my professional judgment dictated. e) Flexibility. If I had a mid-day dental appointment near my house rather than near the office, I would send an email to the company: "FYI: I work from home today". As long as one of the 3 IT guys was on the floor (to help traders in case their monitors went dark), they did not care where the other 2 were. So the question becomes: how valuable is the Joel Test? Why bother with it?

    Read the article

  • SQL Server for the Oracle DBA Links

    - by BuckWoody
    I do a presentation (and a class) called "SQL Server for the Oracle DBA". It's a non-marketing overview that gives you the basics of working with SQL Server if you're already familiar with how Oracle works. This class and these links DO NOT help you with "Why should I use Oracle/SQL Server instead of Oracle/SQL Server" - I'll assume you're already there, and if not, there are LOTS of sites to help you make that decision. Although these links might contain slight marketing slants (I don't control them), I've tried to get the best links I can. Feel free to comment here to add more/better links. As such, these aren't links that help you work with Oracle - they are links to help you work with SQL Server. Some of them contain more information than you actually need; others don't have nearly enough. Taken together (and with the class), you're able to get done what you need to do.

    "Practical SQL Server for Oracle Professionals" - a Microsoft whitepaper, probably the best place to get started: http://download.microsoft.com/download/6/9/d/69d1fea7-5b42-437a-b3ba-a4ad13e34ef6/SQLServer2008forOracle.docx
    Free training: http://technet.microsoft.com/en-us/sqlserver/dd548020.aspx
    Classroom training (will cost you): http://www.microsoft.com/learning/en/us/course.aspx?ID=50068A&locale=en-us
    Terminology differences: http://www.associatedcontent.com/article/2383466/oracle_and_sql_server_basic_terminology.html
    Datatype mapping between Oracle and SQL Server: http://msdn.microsoft.com/en-us/library/ms151817.aspx
    The "other" direction - can still be useful for the Oracle professional to see the other side: http://blog.benday.com/archive/2008/10/23/23195.aspx

    Read the article

  • How much detail is in a good UI regression test?

    - by GlenPeterson
    We use a detailed, step-by-step user-interface regression test for our commercial web application. It has a "backbone" test for the most-used / most important parts of the system, with optional tests for specific areas of functionality. Using this plan has definitely helped us ensure high-quality software. But having very specific tests can be counter-productive. The tester concentrates on following the test and will completely miss usability issues, or not notice fairly obvious problems such as the bottom part of a page being missing. By contrast, some of the best UI testing happens when building a demo of a new feature. I often do my own best testing by pretending to demonstrate the system to an imaginary prospect. Yet when I tell the testers, "Just demonstrate the system to yourself," they don't cover nearly as much functionality as they do with a detailed point-by-point test. I'm repeatedly asked to provide more and more detail in the test plan so that a new, untrained tester can test with it without asking any questions. Yet detail seems to be counter-productive. How much detail do you put in a regression test to make it effective? What techniques make the tester focus more on the system than on checking off items on the test?

    Read the article

  • Is my first employer expecting too much?

    - by priyank patel
    This is my first job as a programmer. I am working with the following technologies: ASP.NET, C#, HTML, CSS, JavaScript, and jQuery. I work for a firm which develops software for small banking firms; currently they have their software running in 100 firms. Their software is developed in Visual FoxPro, and I was hired to develop an online version of it. I am the only developer. My boss is another developer, the only other developer in the firm, so my employer has a total of two developers. My boss does not have any experience with .NET development. I have been working on this project for 8 months. There is progress, but it has been very slow. I try my best to do what my boss asks, but the project just seems too ambitious for me. The company has not done any planning for the project; they just ask me to develop what their older software provides. So I have to deal with the front end, the back end, code review, architecture design, and more. I have decided to give my best, and I try a lot, but the project sometimes just seems overwhelming. Question: Is it normal for a beginner programmer to be in this position? Are my employers just expecting too much of a new programmer? As a programmer, am I lacking the skills one needs to deal with this? I always feel the need to work in at least a small team, if not a big one. I am just not able to judge my situation. Also, I am paid a very low salary, and I work on Saturdays as well. Please help clarify my judgment. Any suggestions are welcome.

    Read the article

  • SCVMM upgrade scenario

    - by pigeon
    I've read some information on TechNet about upgrading SCVMM 2008 to 2012 but can't quite figure out the best way to approach this. The current setup is that we have SCVMM 2008 R2 installed, but against best practice it was actually installed on the Hyper-V host machine - since it's a small-scale deployment, it's just a single-server setup, with SCVMM sitting on the same host rather than in a VM. From what I've read, an in-place upgrade should be possible, which will incur a restart, but I don't have the luxury of another server to shift the VMs onto while doing this, and I don't want to risk anything happening to the Hyper-V role. Ideally I would probably prefer to get SCVMM 2012 into a VM of its own and remove the 2008 version from the host machine. Has anyone done an upgrade like this, or does anyone have recommendations about how to approach it?

    Read the article

  • Benchmark an SSD

    - by Taylor Huston
    What is the best way to test/benchmark an SSD to make sure it's doing its job? I invested in an SSD and have my OS on it, and I want to make sure I am getting my money's worth. I have heard some people make claims along the lines of: "I've had my SSD for X months and my read/write speeds have dropped Y%." What is the best way to test for things like that (and what are good numbers to look for)? For reference, I have a Samsung 830 128GB.
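    Dedicated benchmarks such as CrystalDiskMark or AS SSD Benchmark are the usual answer on Windows (fio on Linux), because they bypass the OS page cache and exercise random as well as sequential access. For a quick sanity check, though, a crude sequential-throughput sketch like the one below works; it does not bypass the page cache, so treat the read figure as an upper bound, and compare results against the drive's rated specs rather than folklore numbers.

        # Crude sequential write/read throughput check; illustrative sketch only.
        import os, time

        path = "testfile.bin"     # place this on the SSD you want to measure
        size_mb = 512
        chunk = os.urandom(1024 * 1024)   # 1 MiB of incompressible data

        t0 = time.perf_counter()
        with open(path, "wb") as f:
            for _ in range(size_mb):
                f.write(chunk)
            f.flush()
            os.fsync(f.fileno())  # force data to the drive before timing stops
        write_s = time.perf_counter() - t0

        t0 = time.perf_counter()
        with open(path, "rb") as f:
            while f.read(1024 * 1024):
                pass
        read_s = time.perf_counter() - t0

        os.remove(path)
        print(f"write: {size_mb / write_s:.0f} MB/s, read: {size_mb / read_s:.0f} MB/s")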

    Read the article

  • Planning for the Recovery

    - by john.orourke(at)oracle.com
    As we plan for 2011, there are many positive signs in the global economy, but also some lingering issues. Planning is no longer about extrapolating past performance and adjusting for growth. It is now about constantly testing the temperature of the water, formulating scenarios, assessing risk, and assigning probabilities. So how does one plan for recovery and improve forecast accuracy in such a volatile environment? Here are some suggestions from a recent article I wrote, which was published in the December Financial Planning & Analysis (FP&A) newsletter from the AFP (Association for Financial Professionals):

    - Increase the frequency of forecasting
    - Get more line managers involved in the planning and forecasting process
    - Re-consider what's being measured - i.e., key financial and operational metrics
    - Incorporate risk and probability into forecasts
    - Reduce reliance on spreadsheets - leverage packaged EPM applications

    To learn more about these best practices, check out the FP&A section of the AFP website and register to receive the FP&A newsletter. AFP recently launched a new topic area focused on the FP&A function and items of interest to this group of finance professionals. In addition to the FP&A quarterly newsletter, AFP will be publishing articles, running webinars, and will have an FP&A track in its annual conference, which is in Boston next November. Brian Kalish, AFP's Finance Lead, is hoping this initiative creates a valuable networking and information-sharing resource for FP&A professionals. Here's a link to the FP&A page on the AFP web site: http://www.afponline.org/pub/res/topics/topics_fpa.html If you register on the site you can access and subscribe to the FP&A newsletter and other resources. Best of luck in your planning for 2011 and beyond!

    Read the article

  • where to look for computer technician jobs

    - by Kareem
    Hi, I am currently studying for the A+ certification. I plan to have it by the end of this month, and I plan to go on for further education. I've built two high-end computers by myself, for a friend and a family member - installed the OS and everything. I'm looking into finding either a computer assembly or computer technician job. Where is the best place to look for one? I've looked into Best Buy, but I find their Geek Squad to be a little bit shady. Where is a good place to look for a full-time, entry-level computer technician job, just starting out, in Tampa, FL?

    Read the article

  • Proper Data Structure for Commentable Comments

    - by Wesley
    I've been struggling with this on an architectural level. I have an object which can be commented on; let's call it a Post. Every Post has a unique ID. Now I want to comment on that Post, and I can use the ID as a foreign key: each PostComment has an ItemID field which correlates to the Post. Since each Post has a unique ID, it is very easy to assign "top level" comments. When I comment on a comment, however, I feel like I now need a PostCommentComment, which attaches to the ID of the PostComment. Since IDs are assigned sequentially per table, I can no longer simply use ItemID to differentiate where in the tree the comment is assigned - i.e., both a Post and a PostComment might have an ID of '5', so my foreign key relationship is ambiguous. This seems like it could go on infinitely, with PostCommentCommentComments etc. What's the best way to solve this? Should I have a field in the comment called "IsPostComment" or something of the like, to know which collection to attach the ID to? That strikes me as the best solution I've seen so far, but now I feel like I need to make recursive database calls, which start to get expensive. Meaning: I get a Post, then get all PostComments where ItemID == Post.ID && IsPostComment == true. Then I take that as a collection, gather all the IDs of the PostComments, and do another search where ItemID == PostComment[all].ID && IsPostComment == false, then repeat infinitely. This means I make a call for every layer, and if I'm loading 100 Posts, I might make 1000 DB calls to get 10 layers of comments each. What is the right way to do this?
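    The usual answer is a single self-referencing Comments table (an adjacency list): every comment row carries the PostID it belongs to, plus a nullable ParentID pointing at another comment. The ID-collision problem disappears because each foreign key targets exactly one table, and one query per post fetches the entire thread, with the tree assembled in memory. A minimal sketch (table and column names are illustrative), shown with SQLite for brevity:

        # One comments table: every row points at its Post and optionally at a
        # parent comment. One query per post; the tree is built in memory, so
        # there are no per-level round trips.
        import sqlite3

        db = sqlite3.connect(":memory:")
        db.executescript("""
            CREATE TABLE Comments (
                ID       INTEGER PRIMARY KEY,
                PostID   INTEGER NOT NULL,                 -- the Post this thread hangs off
                ParentID INTEGER REFERENCES Comments(ID),  -- NULL = top-level comment
                Body     TEXT
            );
            INSERT INTO Comments VALUES (1, 5, NULL, 'top level'),
                                        (2, 5, 1,    'reply'),
                                        (3, 5, 2,    'reply to reply');
        """)

        rows = db.execute("SELECT ID, ParentID, Body FROM Comments WHERE PostID = 5").fetchall()
        children = {}
        for cid, parent, body in rows:
            children.setdefault(parent, []).append((cid, body))

        def print_tree(parent=None, depth=0):
            for cid, body in children.get(parent, []):
                print("  " * depth + body)
                print_tree(cid, depth + 1)

        print_tree()   # prints the thread indented, to any depth, from one query

    This nests to any depth without a PostCommentComment table, and loading comments for 100 posts is 100 queries (or one, with PostID IN (...)) instead of one per layer.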

    Read the article

  • how to stop outgoing email spam

    - by James
    We're running an email system using Roundcube, with about 200 people using it. 99% of them do as they are told and only email clients they have already spoken to; however, 1% of them decide to bulk-spam BCC emails, which then tripped an AOL filter and almost got us banned by our host. I have disabled the guy's account, but I am worried about something similar happening in the future. What would be the best way to stop this? I read that if AOL receives 3 emails within 60 seconds from the same IP address then it's an instant ban, so I am guessing that the big companies like Google must send their email from many different IP addresses? If so, is there any way to implement a similar feature? Also, I have SpamAssassin enabled; in this case, what would be the best configuration for it?
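    Beyond scanning content with SpamAssassin, the standard defense is outbound rate limiting at the MTA: cap how many messages each authenticated sender can emit per hour, and flag the account when the cap is hit. Postfix policy daemons such as policyd implement quotas of this kind; the token-bucket sketch below shows the core idea, with purely illustrative limits:

        # Minimal per-sender token bucket: each user may send `burst` messages
        # at once, refilling at `rate_per_hour`. Numbers are illustrative.
        import time

        class SenderLimiter:
            def __init__(self, rate_per_hour=30, burst=10):
                self.rate = rate_per_hour / 3600.0   # tokens per second
                self.burst = burst
                self.buckets = {}                    # sender -> (tokens, last_ts)

            def allow(self, sender):
                now = time.monotonic()
                tokens, last = self.buckets.get(sender, (self.burst, now))
                tokens = min(self.burst, tokens + (now - last) * self.rate)
                if tokens < 1:
                    self.buckets[sender] = (tokens, now)
                    return False                     # defer/reject and flag for review
                self.buckets[sender] = (tokens - 1, now)
                return True

        limiter = SenderLimiter()
        print(all(limiter.allow("user@example.com") for _ in range(10)))  # True
        print(limiter.allow("user@example.com"))   # False: burst spent, must wait

    Rejecting outright is one option; a gentler policy is to defer the message and alert an admin, which tends to catch a compromised or misbehaving account before a blacklist does.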

    Read the article

  • FoxTales: Behind the Scenes at Fox Software by Kerry Nietz

    Flashbacks from the past! It's truly amazing to discover that software development - from freshman to senior level, as well as project management - hasn't changed that much. Kerry Nietz describes his memoir from his final year at college to his first job at Fox Software to "an early retirement" at Microsoft. This title also brought his other fiction to my attention. Once again, here is the review I published on Amazon: Built to last! I have been around software development for more than a decade now, but honestly I have to admit it is only now that I took the opportunity to read about the history of what used to be my primary programming language. In fact, I started with Visual FoxPro 6 back in 1999 and only went down to FoxPro for Windows 2.6 during migration projects - long after the stories described in this title. It is really interesting to see how they actually managed to create a great product with such a small team of developers. "Create the best Report Writer in the world, out of only sawdust, bubblegum, and dreams." - That's the best sentence I'm going to quote from this title in the future. An inspiration to achieve the impossible, only by taking small steps. Just begin the journey - one step after the next one. If you fall, stand up and continue to walk. Kerry takes the reader on an amazing trip through almost 4 years working at a small software company in Perrysburg, Ohio, that went from yet another look-alike of the mighty Ashton-Tate dBase to the leading force in database development - long before Microsoft Access (project name: Cirrus) was even finished. It survived Borland Paradox, and even nowadays Visual FoxPro is still in daily use in thousands of companies world-wide. Actually, I'm glad that I had the chance to foster my programming knowledge with Visual FoxPro. After his excellent work in software development, Kerry went for a second career as a writer. I'm looking forward to reading his other titles soon.

    Read the article

  • Writing cross-platforms Types, Interfaces and Classes/Methods in C++

    - by user827992
    I'm looking for the best way to write cross-platform software, i.e., code that I write once and have to interface with different libraries and platforms each time. What I consider the easiest part - correct me if I'm wrong - is the definition of new types: all I have to do is write an hpp file with a list of typedefs. I can keep the same names for each new type across the different platforms, so my codebase can be shared without any problem. Typedefs also help me redefine a better scope for my types in my code. I will probably end up having something like this:

    include
    |-windows
    | |-types.hpp
    |-linux
    | |-types.hpp
    |-mac
      |-types.hpp

    For the interfaces I'm thinking about the same solution used for the types: a series of hpp files. Probably I will write all the interfaces only once, since they rely on the types, and all "cross-platform portability" is ensured by the work done on the types.

    include
    |-interfaces.hpp
    |-windows
    | |-types.hpp
    |-linux
    | |-types.hpp
    |-mac
      |-types.hpp

    For classes and methods I do not have a real answer. I would like to avoid two things: the explicit use of pointers, and the use of templates. I want to avoid pointers because they can make the code less readable for some people, and I want to avoid templates simply because if I write them, I can't separate the interface from the definition. What is the best option to hide the use of pointers? I would also like some words about macros and how to implement OS-specific calls and definitions.

    Read the article

  • Managing JS and CSS for a static HTML web application

    - by Josh Kelley
    I'm working on a smallish web application that uses a little bit of static HTML and relies on JavaScript to load the application data as JSON and dynamically create the web page elements from it.

    First question: Is this a fundamentally bad idea? I'm unclear on how many web sites and web applications completely dispense with server-side generation of HTML. (There are obvious disadvantages of JS-only web apps in the areas of graceful degradation / progressive enhancement and search-engine friendliness, but I don't believe these are an issue for this particular app.)

    Second question: What's the best way to manage the static HTML, JS, and CSS? For my "development build," I'd like non-minified third-party code, multiple JS and CSS files for easier organization, etc. For the "release build," everything should be minified, concatenated together, etc. If I were doing server-side generation of HTML, it'd be easy to have my web framework generate different development versus release HTML, including multiple verbose files versus concatenated minified code. But given that I'm only using static HTML, what's the best way to manage this? (I realize I could hack something together with ERB or Perl, but I'm wondering if there are any standard solutions.) In particular, since I'm not doing any server-side HTML generation, is there an easy, semi-standard way of setting up my static HTML so that it contains code like

    <script src="js/vendors/jquery.js"></script>
    <script src="js/class_a.js"></script>
    <script src="js/class_b.js"></script>
    <script src="js/main.js"></script>

    at development time and

    <script src="http://ajax.googleapis.com/ajax/libs/jquery/1.8.2/jquery.min.js"></script>
    <script src="js/entire_app.min.js"></script>

    for release?
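    The usual shape of a solution is a small build step that concatenates the development files into one release file and then runs a minifier (such as Closure Compiler or UglifyJS) over the result, with the HTML switching between the two script blocks. A sketch of the concatenation half - the file names mirror the question above, and the whole thing is one possible approach rather than a standard tool:

        # Concatenate dev-time scripts into the single release file; a real
        # pipeline would then minify the output (e.g. with Closure Compiler).
        files = ["js/class_a.js", "js/class_b.js", "js/main.js"]  # vendor JS comes from the CDN in release

        with open("js/entire_app.js", "w") as out:
            for path in files:
                with open(path) as src:
                    out.write(f"/* {path} */\n")   # provenance comment per source file
                    out.write(src.read())
                    out.write("\n;\n")             # defensive semicolon between files
        print(f"wrote js/entire_app.js from {len(files)} files")

    Driving the script-tag swap from the same file list (emitting either the verbose block or the two-line release block into the HTML) keeps the dev and release pages from drifting apart.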

    Read the article

  • Evolution of an Application: how to manage and improve core engine?

    - by Phil Carter
    The web application I work on has been live for a year now, but it's time for it to evolve, and one of the ways it is evolving is into a multi-brand application - in this case, several different companies using the application, with different templates/content and some slight business-logic changes between them. The problem I'm facing is establishing a best practice across the site where the business logic differs per brand. The differences will mostly be superficial, such as using an alternative mailing-list provider or capturing some extra data in a form. I don't want to have if (brand === x) { ... } else { ... } all over the site, especially as most of what needs to change can be handled by extending an existing class. I've thought of several methods that could be used to instantiate the correct class, but I'm just not sure which is going to be best, especially as some seem to lead to more code duplication than should be necessary. Here's what I've considered:

    1) Use a static loader, similar to Zend_Loader, which takes the class being requested, knows the brand, and returns the correct object: $class = App_Loader::getObject('User', $brand);

    2) Factory classes. We use these in the application already for Products, but we could utilise them here also to provide a transparent interface to the class.

    3) Routing the page request to a specific brand controller. This, however, seems like it would duplicate a lot of code/logic.

    Is there a pattern or something else I should be considering to solve this problem? And how do you manage a growing project that has multiple custom instances in production?

    Update: This is a PHP application, so the decisions about which class to load are made per request. There could be upwards of 100+ different "brands" running.
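    Option 2 scales well if the factory's lookup table is sparse: only brands that actually override a class appear in it, and everything else falls through to the default. The application here is PHP, but the shape of the idea is language-independent; a sketch in Python with hypothetical class and brand names:

        # Factory keyed by (brand, class): sparse override map, default fallback.
        # Brand and class names are invented for illustration.
        class User:
            def mailing_list_provider(self):
                return "default-provider"

        class AcmeUser(User):                  # hypothetical brand override
            def mailing_list_provider(self):
                return "acme-alternative-provider"

        BRAND_OVERRIDES = {("acme", "User"): AcmeUser}   # only real overrides listed

        def get_object(class_name, brand):
            default = globals()[class_name]    # PHP analogue: a class-name registry
            return BRAND_OVERRIDES.get((brand, class_name), default)()

        print(get_object("User", "acme").mailing_list_provider())   # acme-alternative-provider
        print(get_object("User", "other").mailing_list_provider())  # default-provider

    With 100+ brands, the important property is that a brand with no behavioral changes costs zero entries in the map; templates and content stay in data, and only genuine logic differences become subclasses.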

    Read the article

  • Entity Framework with large systems - how to divide models?

    - by jkohlhepp
    I'm working with a SQL Server database with 1000+ tables, another few hundred views, and several thousand stored procedures. We are looking to start using Entity Framework for our newer projects, and we are working out our strategy for doing so. The thing I'm hung up on is how best to split the tables into different models (EDMX, or DbContext if we go code-first). I can think of a few strategies right off the bat:

    Split by schema. We have our tables split across probably a dozen schemas. We could do one model per schema. This isn't perfect, though, because dbo still ends up very large, with 500+ tables/views. Another problem is that certain units of work will end up having to do transactions that span multiple models, which adds complexity, although I assume EF makes this fairly straightforward.

    Split by intent. Instead of worrying about schemas, split the models by intent. So we'd have different models for each application, project, module, or screen, depending on how granular we want to get. The problem I see with this is that certain tables inevitably have to be used in every case, such as User or AuditHistory. Do we add those to every model (which violates DRY, I think), or do they live in a separate model used by every project?

    Don't split at all - one giant model. This is obviously simple from a development perspective, but from my research and my intuition it seems like it could perform terribly at design time, compile time, and possibly run time.

    What is the best practice for using EF against such a large database? Specifically, what strategies do people use in designing models against this volume of DB objects? Are there options I'm not thinking of that would work better than those above? Also, is this a problem in other ORMs, such as NHibernate? If so, have they come up with any better solutions than EF?

    Read the article
