Search Results

Search found 60391 results on 2416 pages for 'data generation'.


  • How to measure code quality? [closed]

    - by Lo Wai Lun
    Is there a methodology or any objective standard for determining whether a project's code is well written? How can we measure code quality in a structured, scientific manner? Many people say code review is important, and that you should always use encapsulation and data abstraction to ensure quality. But how do we actually determine quality? Does drawing structured, organised software design diagrams imply that the code is of good quality? And if we write the code with careful attention to encapsulation and data abstraction, why review it at all?

    Read the article

  • SQL Server 2012 content on Channel 9

    - by jamiet
    A mountain of SQL Server 2012 video content featuring Greg Low, Jonathan Kehayias, Joe Sack and Roger Doherty has just been released on Channel 9. Channel 9 has great support for tags and RSS feeds, so if you want to download all of that content automatically you can simply add the following RSS feed to your podcast reader of choice: http://channel9.msdn.com/Tags/sql+server+2012/RSS. Then have fun learning about all the new features in SQL Server 2012, such as: AlwaysOn, Power View, SSDT, SSRS Data Alerts, SSAS Tabular Modelling, DAX improvements, MDS improvements, SSIS improvements, DQS, StreamInsight improvements, Data-Tier Apps (DACs), LocalDB, FileTable, Spatial improvements, T-SQL paging, Distributed Replay, XEvents improvements, ADO.NET Code First, T-SQL improvements, Server roles, Partitioning improvements and ColumnStore. Whew, quite a list! @jamiet

    Read the article

  • CodePlex Daily Summary for Friday, June 11, 2010

    CodePlex Daily Summary for Friday, June 11, 2010

    New Projects
    BIxPress Community Edition: SSIS Toolset BIDS Addin, Audit, Notify, Deploy, Template: BI xPress is a BIDS Addin/standalone application for SQL Developer/DBA. This tool has many features including Auditing, Notification, Deployment, P...
    C# Shell (cash): Cash is a command-line interpreter (shell), written in C#. It is part of a project to produce tools which replace the traditional GNU/Linux user-l...
    Chernarus Life Revivved: Chernarus life revivved for arma 2 multiplayer
    DKAL: DKAL is a distributed authorization policy language. This project contains an engine for running DKAL policies. It is implemented primarily in F#.
    ImageResizer for Albulle: An application for deploying content for the Albulle PHP photo gallery. The aim is to simplify and automate the resizing...
    Intraweb Active Directory Authentication Demo: A simple demo created using Delphi 7 and Intraweb 9.0.42 with the Active Directory Helper interface to authenticate a user to Active Directory. ...
    Lokad CQRS - build scalable web sites and enterprise solutions on Windows Azure: Lokad CQRS helps to build scalable cloud applications for Windows Azure. It provides time-proven guidance and .NET Application Blocks to help arc...
    LRS: Write a concise, reader-focused summary.
    ManagementPeople99: ...
    MEDILIG - MEDICAL LIFE GUARD: Cross-platform EHR/EMR software for the design, implementation and use of autonomous, open database models for multilingual clinical data managemen...
    NETris - ASP .NET, AJAX, Web Service based Tetris Game: Tetris game with business logic provided via Web Services.
    Peace Through Force: Peace Through Force is a 2D side scrolling tactical shooter developed in XNA Game Studio 3.1 using C#. The game is targeted to be made available o...
    Powwa: Util to show battery status and cpu speed on laptop. Uses jWMI's for interfacing with Windows' WMI.
    Resource Management System: Resource Management system for internal purposes using WPF, WCF and Silverlight
    tChat - ASP.NET, Ajax & Web Service based Chat Room: Simple chat room application. Technology: ASP .NET, Ajax, Web Services and MS SQL Server Database. Uses ASP .NET authentication mechanism.
    Test Project (ignore): This is used to demonstrate CodePlex at meetings. Please ignore this project.
    Transform Config: Transform Config lets you use the new configuration transformation feature in Visual Studio 2010 without performing a publish on a web application p...
    unsocialcity: trying to make a game for facebook called <projectname> - just seemed like a fun idea http://apps.facebook.com/unsocialcity
    VorbisPlayer: VorbisPlayer is the audio user control for Silverlight games. It plays loop-sets seamlessly, it solves the short sound problem, and it can play sound...

    New Releases
    A Guide to Parallel Programming: Drop 5 - Guide Preface, Chapters 1 - 7, and code: This is Drop 5 with Guide Preface, Chapters 1 - 7, Appendix B, Glossary, and References, and the accompanying code samples. This drop requires Visu...
    CC.Hearts Screen Saver: CC.Hearts Screen Saver 1.0.10.610: The third release of CC.Hearts Screen Saver. Key features are: further performance enhancements, keyboard commands (press ? for help), Help popu...
    Fiddler Delayed Responses Extension: v1 beta: UI improvement (drag and drop, session markers, icons, layout), algorithm review, performance issues. Thanks to Eric Lawrence for his ideas.
    FontViewer 2010: FontViewer 2010 (Codename Eraser): This is the installer for the development version ("Eraser") of FontViewer 2010. Because many of the features are under development, functionality...
    Genuilder: Genuilder 1.2: First release of Genuilder.Extensibility.
    ImageResizer for Albulle: ImageResizer1.0-bin: First version. Allows deploying images to a (local) file system.
    imdb movie downloader: myImdb 0.9.5: myImdb 0.9.5
    KooBoo Image Gallery: Beta 4: This new version has a new example using the s3slider script http://www.serie3.info/s3slider/demonstration.html thanks to mshimao. Now there are two pl...
    LogikBug's IoC Container: LogikBug's IoC Container v1.1.1: In this release: I extended the Extensibility namespace. Fixed a few minor issues with the Extensibility namespace.
    LRS: jlrs: asdfa
    LRS: jlrs src: jlrs src
    MiniTwitter: 1.14: MiniTwitter 1.14 changes – Fixes: fixed a bug where the timeline name became the list name even when a list import was cancelled; fixed a bug where you could not log in again after revoking OAuth authorization; fixed a crash when clicking import without selecting a list to import. Additions: ...
    NETris - ASP .NET, AJAX, Web Service based Tetris Game: NETris - Source Code and Documentation: Fully functional prototype. Please note that documentation is written in the form of a report as the project was an assignment at Coventry University.
    Object/Relational Mapper & Code Generator in Net 2.0 for Relational & XML Schema: 2.10: Minor release, incremental changes to sample website and UI templates.
    Opalis Community Releases: Integration Pack for Standard OIS Logging: The Integration Pack for Standard OIS Logging provides extended Policy Logging functions to OIS and MSSQL. This Integration Pack adds the followin...
    PicassoCms: 0.6: More intuitive UI, new controls
    PowerAuras: PowerAuras-3.0.0K-beta2: New auras: Item Name, Equipment Slot Tracking. Changes from beta1: 5 new aura textures, fixed Tracking bug, added graphical equipment slot sele...
    PowerPivot Sample Data: PowerPivot for Excel Tutorial Sample Data-v.2: The PowerPivot Tutorial Sample Data-Version 2 download includes a variety of data sources that you can use to complete the tutorial in the PowerPiv...
    Powwa: First build: Unpack & built with NetBeans 6.8
    Quick Performance Monitor: Version 1.4: Added functionality to add and remove performance counters at run time. Also added saving and loading to file so sets of performance counters can b...
    Resonance: TrainNode Client Library: Libraries to access the TrainNode Service
    SharePoint 2010 Taxonomy Import Utility: TaxonomyBuilder Version 1.0.2: New features: added support for additional term labels per term, added support for Term Set Owners, added support for Term Set stakeholders, upda...
    Silverlight for Umbraco Media Objects (SUMO): Community Tech Preview: The CTP for SUMO is now live, feedback appreciated!
    Silverlight Reporting: Initial Release: This is the first release of the code. It includes the source code from Pete's blog post article on Silverlight reporting.
    Simple.NET: Simple.Mocking 1.0.0.7: Initial version of a new mocking framework for .NET. Revision 1: Expect.AnyInocationOn<T>(T target) changed to Expect.AnyInocationOn(object target...
    Smart Voice: Smart Voice 0.2.2: Changelog: fixed more bugs, added a readme into the archive
    SoulHackers Demon Unite (Chinese version): WPFClient pre alpha 2: pre alpha 2, need your feedback
    Squiggle - A Free open source Lan Messenger: Squiggle 1.5: File transfer capability added (with drag/drop support). Message text box maintains a history of the last 10 messages and you can retrieve them by CTRL+U...
    SSIS Expression Editor & Tester: Expression Editor and Tester v1.0.1.0: Minor updated release of expression editor tool and editor control. Download and extract the files to get started, no install required. Changes: Si...
    StreamInsight Samples: GregLow HighwayMonitor Samples: Initial upload of GregLow HighwayMonitor StreamInsight samples. These samples are used in the upcoming free eClinic for StreamInsight and have been...
    tChat - ASP.NET, Ajax & Web Service based Chat Room: tChat Source Code and Documentation: Functional prototype. T-SQL scripts can be found in the SQL folder. Please note that the documentation is written in a format of a report for a "du...
    Transform Config: Initial Release: This is the initial release of the project. It's all been thrown together quickly so it's lacking error handling etc, but it's still fully function...
    UrzaGatherer: UrzaGatherer v2.0.2: Integrates support for SQL Server Compact Edition 3.5 SP2 for better portability.
    Value Injecter: map anything to anything anyway you might imagine: ValueInjecter 1.9: Features: map anything to anything, flattening, unflattening; includes sample projects for asp.net mvc, asp.net web-forms and win-forms
    VCC: Latest build, v2.1.30610.0: Automatic drop of latest build
    Visual Studio DSite: Picture SlideShow Viewer (Visual C++ 2008): A picture slideshow viewer.
    VorbisPlayer: VorbisPlayer: The first release of the Silverlight VorbisPlayer, including source code and example files.
    WCF 4 Templates for Visual Studio 2010: AnonymousOverHttps Template: Produces a WCF service application configured for anonymous calls over HTTPS/SSL. Supplies a BasicHttpBinding default configured for Transport secu...

    Most Popular Projects
    HaoRan_TokyoTyrantClient
    .NET Transactional File Manager
    SOLID by example
    Memetic NPC Behavior Toolkit
    Sharpotify - Spotify .Net Library
    WCF 4 Templates for Visual Studio 2010
    SFTP Component for .NET CSharp, VB.NET, and ASP.NET
    Ultimate FTP Component for .NET C#, VB.NET and ASP.NET
    Anurag Pallaprolu's Code Repository
    BigfootMVC

    Most Active Projects
    Community Forums NNTP bridge
    jQuery Library for SharePoint Web Services
    Rhyduino - Arduino and Managed Code
    patterns & practices – Enterprise Library
    NB_Store - Free DotNetNuke Ecommerce Catalog Module
    Cassandraemon
    BlogEngine.NET
    MediaCoder.NET
    Andrew's XNA Helpers
    StyleCop

    Read the article

  • SQLAuthority News – SQL Server 2012 Upgrade Technical Guide – A Comprehensive Whitepaper – (454 pages – 9 MB)

    - by pinaldave
    Microsoft has just released the SQL Server 2012 Upgrade Technical Guide. This guide is very comprehensive and covers the subject of upgrading in depth. It is indeed a helpful, detailed white paper; even writing a summary of it would take over 100 pages. This further proves that SQL Server 2012 is quite an important release from Microsoft. The white paper discusses how to upgrade from SQL Server 2008/R2 to SQL Server 2012. I love how it starts with the most interesting and basic discussion of upgrade strategies: 1) in-place upgrade, 2) side-by-side upgrade, 3) one-server, and 4) two-server. This whitepaper is not just pure theory; it is also an excellent source of tips and tricks. Here is an example of a good tip from the paper: “If you want to upgrade just one database from a legacy instance of SQL Server and not upgrade the other databases on the server, use the side-by-side upgrade method instead of the in-place method.” There are so many tips, tricks and bits of trivia that compiling a complete list of them seems humanly impossible in a short period of time. My friend Vinod Kumar, an SQL Server expert, wrote a very interesting article on the SQL Server 2012 upgrade a while back. In that article, Vinod addressed the most interesting and practical questions related to upgrades. He started with the fundamentals of taking backups before the upgrade and ended with fail-safe strategies for after the upgrade is over. He covered the end-to-end concepts in his blog posts in simple words and extremely precise statements. A successful upgrade uses a cycle of planning, documenting the process, testing, refining the process, testing again, planning the upgrade window, execution, verifying the upgrade, and opening for business. If you are at Vinod’s blog post, I suggest you go all the way down and collect the gold mine of most important links. I have bookmarked the post by blogging about it, and I suggest that you bookmark it as well in whichever way you prefer. Vinod Kumar’s blog post on SQL Server 2012 Upgrade Technical Guide. The SQL Server 2012 Upgrade Technical Guide is a detailed resource that is also available online for free. Each chapter was carefully crafted and explained in detail. Before downloading the guide, be aware of its size: 9 MB and 454 pages. Here is a quick list of the chapters included in the whitepaper:
    Chapter 1: Upgrade Planning and Deployment
    Chapter 2: Management Tools
    Chapter 3: Relational Databases
    Chapter 4: High Availability
    Chapter 5: Database Security
    Chapter 6: Full-Text Search
    Chapter 7: Service Broker
    Chapter 8: SQL Server Express
    Chapter 9: SQL Server Data Tools
    Chapter 10: Transact-SQL Queries
    Chapter 11: Spatial Data
    Chapter 12: XML and XQuery
    Chapter 13: CLR
    Chapter 14: SQL Server Management Objects
    Chapter 15: Business Intelligence Tools
    Chapter 16: Analysis Services
    Chapter 17: Integration Services
    Chapter 18: Reporting Services
    Chapter 19: Data Mining
    Chapter 20: Other Microsoft Applications and Platforms
    Appendix 1: Version and Edition Upgrade Paths
    Appendix 2: SQL Server 2012: Upgrade Planning Checklist
    Download SQL Server 2012 Upgrade Technical Guide [454 pages and 9 MB]
    Reference: Pinal Dave (http://blog.sqlauthority.com)

    Read the article

  • VLOOKUP in Excel, part 2: Using VLOOKUP without a database

    - by Mark Virtue
    In a recent article, we introduced the Excel function called VLOOKUP and explained how it could be used to retrieve information from a database into a cell in a local worksheet.  In that article we mentioned that there were two uses for VLOOKUP, and only one of them dealt with querying databases.  In this article, the second and final in the VLOOKUP series, we examine this other, lesser known use for the VLOOKUP function. If you haven’t already done so, please read the first VLOOKUP article – this article will assume that many of the concepts explained in that article are already known to the reader. When working with databases, VLOOKUP is passed a “unique identifier” that serves to identify which data record we wish to find in the database (e.g. a product code or customer ID).  This unique identifier must exist in the database, otherwise VLOOKUP returns us an error.  In this article, we will examine a way of using VLOOKUP where the identifier doesn’t need to exist in the database at all.  It’s almost as if VLOOKUP can adopt a “near enough is good enough” approach to returning the data we’re looking for.  In certain circumstances, this is exactly what we need. We will illustrate this article with a real-world example – that of calculating the commissions that are generated on a set of sales figures.  We will start with a very simple scenario, and then progressively make it more complex, until the only rational solution to the problem is to use VLOOKUP.  The initial scenario in our fictitious company works like this:  If a salesperson creates more than $30,000 worth of sales in a given year, the commission they earn on those sales is 30%.  Otherwise their commission is only 20%.  So far this is a pretty simple worksheet: To use this worksheet, the salesperson enters their sales figures in cell B1, and the formula in cell B2 calculates the correct commission rate they are entitled to receive, which is used in cell B3 to calculate the total commission that the salesperson is owed (which is a simple multiplication of B1 and B2). The cell B2 contains the only interesting part of this worksheet – the formula for deciding which commission rate to use: the one below the threshold of $30,000, or the one above the threshold.  This formula makes use of the Excel function called IF.  For those readers that are not familiar with IF, it works like this: IF(condition,value if true,value if false) Where the condition is an expression that evaluates to either true or false.  In the example above, the condition is the expression B1<B5, which can be read as “Is B1 less than B5?”, or, put another way, “Are the total sales less than the threshold”.  If the answer to this question is “yes” (true), then we use the value if true parameter of the function, namely B6 in this case – the commission rate if the sales total was below the threshold.  If the answer to the question is “no” (false), then we use the value if false parameter of the function, namely B7 in this case – the commission rate if the sales total was above the threshold. As you can see, using a sales total of $20,000 gives us a commission rate of 20% in cell B2.  If we enter a value of $40,000, we get a different commission rate: So our spreadsheet is working. Let’s make it more complex.  Let’s introduce a second threshold:  If the salesperson earns more than $40,000, then their commission rate increases to 40%: Easy enough to understand in the real world, but in cell B2 our formula is getting more complex.  
If you look closely at the formula, you’ll see that the third parameter of the original IF function (the value if false) is now an entire IF function in its own right.  This is called a nested function (a function within a function).  It’s perfectly valid in Excel (it even works!), but it’s harder to read and understand. We’re not going to go into the nuts and bolts of how and why this works, nor will we examine the nuances of nested functions.  This is a tutorial on VLOOKUP, not on Excel in general. Anyway, it gets worse!  What about when we decide that if they earn more than $50,000 then they’re entitled to 50% commission, and if they earn more than $60,000 then they’re entitled to 60% commission? Now the formula in cell B2, while correct, has become virtually unreadable.  No-one should have to write formulae where the functions are nested four levels deep!  Surely there must be a simpler way? There certainly is.  VLOOKUP to the rescue! Let’s redesign the worksheet a bit.  We’ll keep all the same figures, but organize it in a new way, a more tabular way: Take a moment and verify for yourself that the new Rate Table works exactly the same as the series of thresholds above. Conceptually, what we’re about to do is use VLOOKUP to look up the salesperson’s sales total (from B1) in the rate table and return to us the corresponding commission rate.  Note that the salesperson may have indeed created sales that are not one of the five values in the rate table ($0, $30,000, $40,000, $50,000 or $60,000).  They may have created sales of $34,988.  It’s important to note that $34,988 does not appear in the rate table.  Let’s see if VLOOKUP can solve our problem anyway… We select cell B2 (the location we want to put our formula), and then insert the VLOOKUP function from the Formulas tab: The Function Arguments box for VLOOKUP appears.  We fill in the arguments (parameters) one by one, starting with the Lookup_value, which is, in this case, the sales total from cell B1.  We place the cursor in the Lookup_value field and then click once on cell B1: Next we need to specify to VLOOKUP what table to lookup this data in.  In this example, it’s the rate table, of course.  We place the cursor in the Table_array field, and then highlight the entire rate table – excluding the headings: Next we must specify which column in the table contains the information we want our formula to return to us.  In this case we want the commission rate, which is found in the second column in the table, so we therefore enter a 2 into the Col_index_num field: Finally we enter a value in the Range_lookup field. Important:  It is the use of this field that differentiates the two ways of using VLOOKUP.  To use VLOOKUP with a database, this final parameter, Range_lookup, must always be set to FALSE, but with this other use of VLOOKUP, we must either leave it blank or enter a value of TRUE.  When using VLOOKUP, it is vital that you make the correct choice for this final parameter. To be explicit, we will enter a value of true in the Range_lookup field.  It would also be fine to leave it blank, as this is the default value: We have completed all the parameters.  We now click the OK button, and Excel builds our VLOOKUP formula for us: If we experiment with a few different sales total amounts, we can satisfy ourselves that the formula is working. Conclusion In the “database” version of VLOOKUP, where the Range_lookup parameter is FALSE, the value passed in the first parameter (Lookup_value) must be present in the database.  
    In other words, we’re looking for an exact match. But in this other use of VLOOKUP, we are not necessarily looking for an exact match. In this case, “near enough is good enough”. But what do we mean by “near enough”? Let’s use an example: When searching for a commission rate on a sales total of $34,988, our VLOOKUP formula will return us a value of 30%, which is the correct answer. Why did it choose the row in the table containing 30%? What, in fact, does “near enough” mean in this case? Let’s be precise: When Range_lookup is set to TRUE (or omitted), VLOOKUP will look in column 1 and match the highest value that is not greater than the Lookup_value parameter. It’s also important to note that for this system to work, the table must be sorted in ascending order on column 1! If you would like to practice with VLOOKUP, the sample file illustrated in this article can be downloaded from here.
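
    To make the two approaches concrete, here is roughly what the finished formulas look like. This is a minimal sketch: the cell references assume the layout described in the article (sales total in B1, threshold in B5, the two rates in B6 and B7), and the rate table is assumed to sit in A5:B9, so adjust them to your own worksheet.

        =IF(B1<B5, B6, B7)
        =VLOOKUP(B1, A5:B9, 2, TRUE)

    The first formula is the simple two-rate version built with IF; the second is the rate-table version, where the final TRUE (or an omitted fourth argument) asks VLOOKUP for the approximate match, returning the rate from the last row whose first-column value does not exceed the sales total.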

    Read the article

  • The Modern Marketer’s Guide to Connected Customer Journeys

    - by Richard Lefebvre
    By Amanda Batista on Thursday, August 14, 2014, in Marketing Efficiency. Organizations are striving to deliver consistent experiences, but very few feel they are there yet. It’s a simple consideration for marketers, really. Not only does industry data continue to show that customers demand personalized experiences when engaging with brands, but if you think about your own consumer-driven shopping experiences, you, too, expect that stellar experience at every touch point. And when you don’t get it, that brand has potentially ruined the experience, along with its shot at engaging with you in more meaningful ways. Oracle Marketing Cloud partnered with marketingfinder.co.uk to conduct a survey exploring how marketers are adapting to this new age of the customer and the challenges they face. Less than half (40%) of the marketers in the study were able to track the customer journey across channels. These findings, as well as other data points showcasing marketers’ challenges, are explored in our latest eBook, “The Modern Marketer's Guide to Connected Customer Journeys.” Read the entire article and order your copy of the full report here.

    Read the article

  • Ubuntu tweak and Mozilla (firefox and thunderbird) cache

    - by Avatar Parto
    I usually use Ubuntu Tweak to do cleanup jobs on my PC. This includes apt and program cache data and old kernels. This works alright for most programs, except the Mozilla-based applications – Firefox and Thunderbird. Ubuntu Tweak doesn't seem to know where their cache folders are and always returns 'zero packages can be cleaned', even when the cache folders are full. Check the screenshot below: I am looking for a way to clean up ALL my cache data and unneeded packages in one place. If someone knows how to point Ubuntu Tweak at the Firefox and Thunderbird cache folders, that would be perfect. I tried BleachBit last, but it crashed my PC to the point that I had to re-install Ubuntu. I am using Ubuntu Tweak 0.8.6 on Ubuntu 13.04. Thanks.
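
    For reference, a manual cleanup usually comes down to finding and clearing the Mozilla cache directories by hand. A minimal shell sketch, assuming the default profile locations on a 13.04-era install (the exact directory names vary by Firefox/Thunderbird version, so inspect before deleting and close both applications first; they rebuild their caches on the next start):

        # See how much space the Mozilla caches are using (newer builds cache under ~/.cache,
        # older ones keep a Cache folder inside the profile under ~/.mozilla):
        du -sh ~/.cache/mozilla/firefox/*/ ~/.cache/thunderbird/*/ ~/.mozilla/firefox/*/Cache 2>/dev/null
        # If those are the folders filling up, remove the cache directories while both apps are closed:
        rm -rf ~/.cache/mozilla/firefox/*/{Cache,cache2} ~/.cache/thunderbird/*/{Cache,cache2}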

    Read the article

  • A Plea for Plain English

    - by Tony Davis
    The English language has, within a lifetime, emerged as the ubiquitous 'international language' of scientific, political and technical communication. On the one hand, learning a single, common language, International English, has made it much easier to participate in and adopt new technologies; on the other hand it must be exasperating to have to use English at international conferences, or on community sites, when your own language has a long tradition of scientific and technical usage. It is also hard to master the subtleties of using a foreign language to explain advanced ideas. This requires English speakers to be more considerate in their writing. Even if you’re used to speaking English, you may be brought up short by this sort of verbiage… "Business Intelligence delivering actionable insights is becoming more critical in the enterprise, and these insights require large data volumes for trending and forecasting" It takes some imagination to appreciate the added hassle in working out what it means, when English is a language you only use at work. Try, just to get a vague feel for it, using Google Translate to translate it from English to Chinese and back again. "Providing actionable business intelligence point of view is becoming more and more and more business critical, and requires that these insights and projected trends in large amounts of data" Not easy eh? If you normally use a different language, you will need to pause for thought before finally working out that it really means … "Every Business Intelligence solution must be able to help companies to make decisions. In order to detect current trends, and accurately predict future ones, we need to analyze large volumes of data" Surely, it is simple politeness for English speakers to stop peppering their writing with a twisted vocabulary that renders it inaccessible to everyone else. It isn’t just the problem of writers who use long words to give added dignity to their prose. It is the use of Colloquial English. This changes and evolves at a dizzying rate, adding new terms and idioms almost daily; it is almost a new and separate language. By contrast, ‘International English', is gradually evolving separately, at its own, more sedate, pace. As such, all native English speakers need to make an effort to learn, and use it, switching from casual colloquial patter into a simpler form of communication that can be widely understood by different cultures, even if it gives you less credibility on the street. Simple-Talk is based, at least in part, on the idea that technical articles can be written simply and clearly in a form of English that can be easily understood internationally, and that they can be written, with a little editorial help, by anyone, and read by anyone, regardless of their native language. Cheers, Tony.

    Read the article

  • Find the occurrence of word/character in SQL column with wildcard character - PATINDEX

    - by Vipin
    CHARINDEX and PATINDEX can both be used to determine the presence of a character or string within SQL column data. Both return the starting position of the first occurrence of the character/word within the expression. However, one major difference between CHARINDEX and PATINDEX is that the latter allows the use of wildcard characters while searching for a character or word within column data. PATINDEX is also useful for searching within the Text datatype. The allowed wildcard characters are % and _ :
    " % " - matches any number of characters
    " _ " - matches a single character
    Syntax: PATINDEX('%pattern%', string_expression)
    Note - it is mandatory to wrap the pattern in % characters. The function returns the starting position of the first occurrence of the pattern if it is found, returns 0 if it is not found, and returns NULL if either pattern or string_expression is null.
    Example: SELECT fldname FROM tblUsers WHERE PATINDEX('%v_pin%', fldname) > 0
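
    A quick side-by-side comparison of the two functions (a minimal sketch against a literal string, so it can be run anywhere; the sample value 'vipin kumar' is made up):

        -- CHARINDEX only matches a literal substring
        SELECT CHARINDEX('pin', 'vipin kumar');   -- returns 3
        -- PATINDEX accepts % and _ wildcards in the pattern
        SELECT PATINDEX('%p_n%', 'vipin kumar');  -- returns 3 ('p', any single character, 'n')
        SELECT PATINDEX('%xyz%', 'vipin kumar');  -- returns 0 (pattern not found)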

    Read the article

  • FREE Windows Azure evening in London on April 15th including FREE access to Windows Azure

    - by Eric Nelson
    [Did I overdo the use of FREE in the title? :-)] April 12th to 16th is Microsoft Tech Days – 5 days of sessions on Visual Studio 2010 through to Windows Phone 7 Series. Many of these days are now full (tip: Thursday still has room if rich client applications are your thing), but the good news is that the development community in the UK has pulled together an awesome series of “fringe events” during April in London and elsewhere in the UK. There are sessions on Silverlight, SQL Server 2008 R2, SharePoint 2010 and … the Windows Azure Platform. The UK AzureNET user group is planning to put on a great evening, and AzureNET will be giving away hundreds of free subscriptions to the Windows Azure Platform during the evening. The subscription includes up to 20 Windows Azure compute nodes and 3 SQL Azure databases for you to play with over the 2 weeks following the event. This is a great opportunity to really explore the Windows Azure Platform in detail – without a credit card! Register now! (and you might also want to join the UK Fans of Azure Community while I have your attention) FYI the Thursday daytime event includes an introduction to Windows Azure session delivered by my colleague David – which would be an ideal session to attend if you are new to Azure and want to get the most out of the evening session.
    7:00pm: See the difference: How Windows Azure helped build a new way of giving – Simon Evans and James Broome (@broomej). They will cover the business context for Azure and then go into patterns used and lessons learnt from the project... as well as showing off the app, of course!
    8:00pm: UK AzureNET update
    8:15pm: NoSQL databases or: How I learned to love the hash table – Mark Rendle (@markrendle). In this session Mark will look at how Azure Table Service works and how to use it. We’ll look briefly at the high-level Data Services SDK, talk about its limitations, and then quickly move on to the REST API and how to use it to improve performance and reduce costs. We’ll make up some pretend real-world problems and solve them in new and interesting ways. We’ll denormalise data (for fun and profit). We’ll talk about how certain social networking sites can deal with huge volumes of data so quickly, and why it sometimes goes wrong.
    Check out the complete list of fringe events, which covers the UK fairly well:

    Read the article

  • Oracle SQL Developer v3.2.1 Now Available

    - by thatjeffsmith
    Oracle SQL Developer version 3.2.1 is now available. I recommend that everyone now upgrade to this release. It features more than 200 bug fixes, tweaks, and polish applied to the 3.2 edition. The high-profile bug fixes submitted by customers and users on our forums are listed in all their glory for your review. I want to highlight a few of the changes though, as I recognize many of you lack the time and/or patience to ‘read the docs.’ That would include me, which is why I enjoy writing these kinds of blog posts. I’m lazy – just like you!
    No more artificial line breaks between CREATE OR REPLACE and your PL/SQL
    In versions 3.2 and older, when you pull up your stored procedural objects in our editor, you would see a line break inserted between the CREATE OR REPLACE and the body of your code. In version 3.2.1, we have removed the line break. (Screenshots: 3.1 vs. 3.2.1.)
    Trivia – Did You Know? The database doesn’t store the ‘CREATE’ or ‘CREATE OR REPLACE’ bit of your PL/SQL code. If we look at the USER_SOURCE view, we can see that the stored code begins with the object name. So the CREATE OR REPLACE bit is ‘artificial’. The intent is to give you the code necessary to recreate your object – and have it ‘compile’ into the database. We pretty much HAVE to add the ‘CREATE OR REPLACE.’ From now on it will appear inline with the first line of your code.
    Exporting Tables & Views
    When exporting data from your tables or views, previous versions of SQL Developer presented a 3-step wizard. It allowed you to choose your columns and apply data filters for what is exported. This was kind of redundant. The grids already allowed you to select your columns and apply filters. Wouldn’t it be more intuitive AND efficient to just make the grids behave in a What You See Is What You Get (WYSIWYG) fashion? In version 3.2.1, that is exactly what will happen. The wizard now only has two steps and the grid will export the data and columns as defined in the visible grid. Let the grid properties define what is actually exported! And here is what is pasted into my worksheet:
    "BREWERY"|"CITY"
    "3 Brewers Restaurant Micro-Brewery"|"Toronto"
    "Amsterdam Brewing Co."|"Toronto"
    "Ball Brewing Company Ltd."|"Toronto"
    "Big Ram Brewing Company"|"Toronto"
    "Black Creek Historic Brewery"|"Toronto"
    "Black Oak Brewing"|"Toronto"
    "C'est What?"|"Toronto"
    "Cool Beer Brewing Company"|"Toronto"
    "Denison's Brewing"|"Toronto"
    "Duggan's Brewery"|"Toronto"
    "Feathers"|"Toronto"
    "Fermentations! - Danforth"|"Toronto"
    "Fermentations! - Mount Pleasant"|"Toronto"
    "Granite Brewery & Restaurant"|"Toronto"
    "Labatt's Breweries of Canada"|"Toronto"
    "Mill Street Brew Pub"|"Toronto"
    "Mill Street Brewery"|"Toronto"
    "Molson Breweries of Canada"|"Toronto"
    "Molson Brewery at Air Canada Centre"|"Toronto"
    "Pioneer Brewery Ltd."|"Toronto"
    "Post-Production Bistro"|"Toronto"
    "Rotterdam Brewing"|"Toronto"
    "Steam Whistle Brewing"|"Toronto"
    "Strand Brasserie"|"Toronto"
    "Upper Canada Brewing"|"Toronto"
    JUST what I wanted.
    And One Last Thing
    Speaking of export, sometimes I want to send data to Excel. And sometimes I want to send multiple objects to Excel – to a single Excel file, that is. In version 3.2.1 you can now do that. Let’s export the bulk of the HR schema to Excel, with each table going to its own worksheet in the same workbook (select many tables, put them in a single Excel workbook). If you try this in previous versions of SQL Developer it will just write the first table to the Excel file. This is one of the bugs we addressed in v3.2.1. Here is what the output Excel file looks like now: many tables – many worksheets in one Excel workbook. I have a sneaky suspicion that this will be a frequently used feature going forward. Excel seems to be the cornerstone of many of our popular features. Imagine that!
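
    Circling back to the USER_SOURCE trivia above, you can see it for yourself with a quick data-dictionary query. A minimal sketch (the object name MY_PROC is hypothetical; USER_SOURCE exposes NAME, TYPE, LINE and TEXT columns):

        SELECT line, text
          FROM user_source
         WHERE name = 'MY_PROC'   -- hypothetical object name, in upper case
           AND type = 'PROCEDURE'
         ORDER BY line;

    The first line of TEXT starts with the object name itself, not with CREATE OR REPLACE, which is why SQL Developer has to prepend it for you.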

    Read the article

  • Implicit Intent is not working [migrated]

    - by Sayem Siam
    I have an activity class named NotesList. In the NotesList class I have tried to insert a new note. For that I have used an implicit Intent. But when I click to insert a new note it gives a runtime error.

    public boolean onOptionsItemSelected(MenuItem item) {
        switch (item.getItemId()) {
        case R.id.menu_add:
            Log.d("sayem", "in case of fd");
            Toast.makeText(this, "in the", Toast.LENGTH_LONG).show();
            startActivity(new Intent(Intent.ACTION_INSERT, getIntent().getData()));
            break;
        default:
            throw new IllegalArgumentException("not matched");
        }
        return true;
    }

    And I have a NoteEditor activity class to insert a new note. And here is my AndroidManifest.xml file:

    <uses-sdk android:minSdkVersion="14" />
    <application
        android:icon="@drawable/ic_launcher"
        android:label="@string/app_name" >
        <activity
            android:label="@string/app_name"
            android:name=".NotesList" >
            <intent-filter>
                <action android:name="android.intent.action.MAIN" />
                <category android:name="android.intent.category.LAUNCHER" />
            </intent-filter>
            <intent-filter>
                <action android:name="android.intent.action.VIEW" />
                <action android:name="android.intent.action.EDIT" />
                <action android:name="android.intent.action.PICK" />
                <category android:name="android.intent.category.DEFAULT" />
                <data android:mimeType="vnd.android.cursor.dir/vnd.google.note" />
            </intent-filter>
            <intent-filter>
                <action android:name="android.intent.action.GET_CONTENT" />
                <category android:name="android.intent.category.DEFAULT" />
                <data android:mimeType="vnd.android.cursor.item/vnd.google.note" />
            </intent-filter>
        </activity>
        <activity android:name="NoteEditor" >
            <intent-filter>
                <action android:name="NoteEditor"></action>
                <action android:name="android.intent.action.INSERT" />
                <action android:name="android.intent.action.PASTE" />
                <category android:name="android.intent.category.DEFAULT" />
                <data android:mimeType="vnd.android.cursor.dir/vnd.google.note" />
            </intent-filter>
        </activity>
    </application>
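
    For what it's worth, a common cause of this kind of runtime failure (typically an ActivityNotFoundException) is that getIntent().getData() is null when NotesList is launched from the home screen, so the ACTION_INSERT intent carries no data or MIME type and matches none of the NoteEditor filters above. A minimal sketch of a workaround – the content URI shown is hypothetical, so use whatever URI your notes ContentProvider actually exposes:

        // Option 1: state the MIME type explicitly so the intent matches NoteEditor's filter.
        Intent insert = new Intent(Intent.ACTION_INSERT);
        insert.setType("vnd.android.cursor.dir/vnd.google.note");
        startActivity(insert);

        // Option 2: fall back to the provider's content URI when the launching intent has no data.
        Uri data = getIntent().getData();
        if (data == null) {
            data = Uri.parse("content://com.google.provider.NotePad/notes"); // hypothetical authority
        }
        startActivity(new Intent(Intent.ACTION_INSERT, data));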

    Read the article

  • Why does max-width behave counter intuitively on columns in a table? [migrated]

    - by Nate
    Basically, I have a stretchy table, I want my label column to be fixed width and my data column to be dynamically sized. My inclination would be to set the max-width via CSS on my label column. However, this has the opposite effect. I've created a jsfiddle that replicates this. (Re-size the window to see the left column dynamically sized and the right column fixed size) On my own site, I see the same behavior and it happens in IE and Chrome. If I switch it, and set max-width on the data column, everything behaves as I want, but it feels backwards to me. Am I doing something wrong here?
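
    For context, CSS 2.1 explicitly leaves the effect of min-width and max-width on table cells and columns undefined, which is one plausible reason the behaviour looks backwards here. A minimal sketch of a commonly used alternative – a fixed table layout with an explicit width on the label column (the selectors are hypothetical):

        table.stretchy {
          width: 100%;
          table-layout: fixed;  /* column widths come from the first row, not from content */
        }
        table.stretchy td.label {
          width: 10em;          /* fixed-width label column */
        }
        table.stretchy td.data {
          /* no width set: with table-layout: fixed, this column absorbs the remaining space */
        }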

    Read the article

  • IBM DB2 and the “'DbProviderFactories' section can only appear once per config” error

    - by Davide Mauri
    IBM doesn’t like MS. That’s a fact. And that’s why you can get your machine.config file (!!!) corrupted if you try to install the IBM DB2 data providers on your server machine. If at some point after installing the IBM DB2 data providers your SSIS packages, SSAS cubes or SSRS reports start to complain that the 'DbProviderFactories' section can only appear once per config, you may want to look into your machine.config, located in the %runtime install path%\Config directory (see http://msdn.microsoft.com/en-us/library/ms229697%28v=vs.71%29.aspx). Almost surely you’ll find the IBM DB2 provider sitting in an additional DbProviderFactories section all alone. Poor guy. Remove the duplicate DbProviderFactories element, merging everything into a single DbProviderFactories section, and after that everything will start to work again.
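
    As a rough illustration of what the fix looks like (a sketch only – the exact provider attributes depend on the DB2 version installed, so copy them from your own machine.config rather than from here), the goal is to end up with one DbProviderFactories element under system.data that contains every provider entry:

        <system.data>
          <DbProviderFactories>
            <!-- existing Microsoft/other provider entries stay here -->
            <add name="IBM DB2 .NET Data Provider"
                 invariant="IBM.Data.DB2"
                 description="IBM DB2 Data Provider for .NET Framework"
                 type="IBM.Data.DB2.DB2Factory, IBM.Data.DB2, ..." />
            <!-- the second, lone <DbProviderFactories> element left behind by the DB2 installer
                 must be deleted after its entries are moved here -->
          </DbProviderFactories>
        </system.data>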

    Read the article

  • More on Visual Studio 11 from Scott Guthrie

    - by TATWORTH
    At http://weblogs.asp.net/scottgu/archive/2011/10/30/web-forms-model-binding-part-3-updating-and-validation-asp-net-4-5-series.aspx, Scott Guthrie talks about data binding in ASP.NET 4.5. There is a key statement: "Because our GetProducts() method is returning an IQueryable<Product>, users can easily page and sort through the data within our GridView. Only the 10 rows that are visible on any given page are returned from the database." Consider paging through a large dataset: this is going to give high performance with very little code, as the traffic between the database and the IIS server is reduced.
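
    To make the quoted point concrete, the pattern Scott describes boils down to a select method in the page's code-behind that returns an unexecuted IQueryable<T>, so the GridView can append paging and sorting before the query hits the database. A minimal sketch (the Product and ProductsContext names are hypothetical stand-ins for your own Entity Framework model):

        // Select method referenced by the GridView's SelectMethod attribute.
        public IQueryable<Product> GetProducts()
        {
            var db = new ProductsContext();
            // Returning IQueryable<T> defers execution: the GridView adds Skip/Take for paging
            // and OrderBy for sorting, so only the visible page of rows is fetched.
            return db.Products;
        }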

    Read the article

  • ls -l freezes terminal locally and remotely

    - by Jakobud
    I've been reading other SF threads regarding ls not returning results or freezing and stalling terminal sessions and it appears they usually the fault of network problems. My problem however, occurs both over remote SSH sessions but also if I am physically at the server itself... I just installed CentOS 5.4 on one of our servers. I'm setting up some rdiff-backup scripts and when I downloaded librsync and untared it, thats when I started seeing some weird behavior with ls -l. wget http://sourceforge.net/projects/librsync/files/librsync/0.9.7/librsync-0.9.7.tar.gz/download /tmp cd /tmp tar -xzf librsync-0.9.7.tar.gz Simple enough. To view the files in this directory I did this: ls results: librsync-0.9.7 librsync-0.9.7.tar.gz Now, if I ls -l, my terminal freezes. I have to re-ssh in to keep going. After reading SF threads, I thought it was network related. So I was extremely surprised to go sit down at the server itself and see the exact same thing happen... So its obviously not a network issues. Even if I ls /tmp/librsync-0.9.7, my terminal freezes just the same... Next I did an strace and got this (warning: wall of text coming....): strace ls -l /tmp execve("/bin/ls", ["ls", "-l", "/tmp"], [/* 21 vars */]) = 0 brk(0) = 0x1c521000 mmap(NULL, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_ANONYMOUS, -1, 0) = 0x2b8582cc0000 uname({sys="Linux", node="massive.answeron.com", ...}) = 0 access("/etc/ld.so.preload", R_OK) = -1 ENOENT (No such file or directory) open("/etc/ld.so.cache", O_RDONLY) = 3 fstat(3, {st_mode=S_IFREG|0644, st_size=71746, ...}) = 0 mmap(NULL, 71746, PROT_READ, MAP_PRIVATE, 3, 0) = 0x2b8582cc1000 close(3) = 0 open("/lib64/librt.so.1", O_RDONLY) = 3 read(3, "\177ELF\2\1\1\0\0\0\0\0\0\0\0\0\3\0>\0\1\0\0\0 \"\200\2730\0\0\0"..., 832) = 832 fstat(3, {st_mode=S_IFREG|0755, st_size=53448, ...}) = 0 mmap(NULL, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_ANONYMOUS, -1, 0) = 0x2b8582cd3000 mmap(0x30bb800000, 2132936, PROT_READ|PROT_EXEC, MAP_PRIVATE|MAP_DENYWRITE, 3, 0) = 0x30bb800000 mprotect(0x30bb807000, 2097152, PROT_NONE) = 0 mmap(0x30bba07000, 8192, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_FIXED|MAP_DENYWRITE, 3, 0x7000) = 0x30bba07000 close(3) = 0 open("/lib64/libacl.so.1", O_RDONLY) = 3 read(3, "\177ELF\2\1\1\0\0\0\0\0\0\0\0\0\3\0>\0\1\0\0\0\0\31@\2740\0\0\0"..., 832) = 832 fstat(3, {st_mode=S_IFREG|0755, st_size=28008, ...}) = 0 mmap(0x30bc400000, 2120992, PROT_READ|PROT_EXEC, MAP_PRIVATE|MAP_DENYWRITE, 3, 0) = 0x30bc400000 mprotect(0x30bc406000, 2093056, PROT_NONE) = 0 mmap(0x30bc605000, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_FIXED|MAP_DENYWRITE, 3, 0x5000) = 0x30bc605000 close(3) = 0 open("/lib64/libselinux.so.1", O_RDONLY) = 3 read(3, "\177ELF\2\1\1\0\0\0\0\0\0\0\0\0\3\0>\0\1\0\0\0`E\300\2730\0\0\0"..., 832) = 832 fstat(3, {st_mode=S_IFREG|0755, st_size=95464, ...}) = 0 mmap(0x30bbc00000, 2192784, PROT_READ|PROT_EXEC, MAP_PRIVATE|MAP_DENYWRITE, 3, 0) = 0x30bbc00000 mprotect(0x30bbc15000, 2097152, PROT_NONE) = 0 mmap(0x30bbe15000, 8192, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_FIXED|MAP_DENYWRITE, 3, 0x15000) = 0x30bbe15000 mmap(0x30bbe17000, 1424, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_FIXED|MAP_ANONYMOUS, -1, 0) = 0x30bbe17000 close(3) = 0 open("/lib64/libc.so.6", O_RDONLY) = 3 read(3, "\177ELF\2\1\1\0\0\0\0\0\0\0\0\0\3\0>\0\1\0\0\0\220\332\201\2720\0\0\0"..., 832) = 832 fstat(3, {st_mode=S_IFREG|0755, st_size=1717800, ...}) = 0 mmap(NULL, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_ANONYMOUS, -1, 0) = 0x2b8582cd4000 mmap(0x30ba800000, 3498328, PROT_READ|PROT_EXEC, 
MAP_PRIVATE|MAP_DENYWRITE, 3, 0) = 0x30ba800000 mprotect(0x30ba94d000, 2097152, PROT_NONE) = 0 mmap(0x30bab4d000, 20480, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_FIXED|MAP_DENYWRITE, 3, 0x14d000) = 0x30bab4d000 mmap(0x30bab52000, 16728, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_FIXED|MAP_ANONYMOUS, -1, 0) = 0x30bab52000 close(3) = 0 open("/lib64/libpthread.so.0", O_RDONLY) = 3 read(3, "\177ELF\2\1\1\0\0\0\0\0\0\0\0\0\3\0>\0\1\0\0\0\220W\0\2730\0\0\0"..., 832) = 832 fstat(3, {st_mode=S_IFREG|0755, st_size=145824, ...}) = 0 mmap(0x30bb000000, 2204528, PROT_READ|PROT_EXEC, MAP_PRIVATE|MAP_DENYWRITE, 3, 0) = 0x30bb000000 mprotect(0x30bb016000, 2093056, PROT_NONE) = 0 mmap(0x30bb215000, 8192, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_FIXED|MAP_DENYWRITE, 3, 0x15000) = 0x30bb215000 mmap(0x30bb217000, 13168, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_FIXED|MAP_ANONYMOUS, -1, 0) = 0x30bb217000 close(3) = 0 open("/lib64/libattr.so.1", O_RDONLY) = 3 read(3, "\177ELF\2\1\1\0\0\0\0\0\0\0\0\0\3\0>\0\1\0\0\0\320\17\300\2750\0\0\0"..., 832) = 832 fstat(3, {st_mode=S_IFREG|0755, st_size=17888, ...}) = 0 mmap(0x30bdc00000, 2110728, PROT_READ|PROT_EXEC, MAP_PRIVATE|MAP_DENYWRITE, 3, 0) = 0x30bdc00000 mprotect(0x30bdc04000, 2093056, PROT_NONE) = 0 mmap(0x30bde03000, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_FIXED|MAP_DENYWRITE, 3, 0x3000) = 0x30bde03000 close(3) = 0 open("/lib64/libdl.so.2", O_RDONLY) = 3 read(3, "\177ELF\2\1\1\0\0\0\0\0\0\0\0\0\3\0>\0\1\0\0\0\20\16\300\2720\0\0\0"..., 832) = 832 fstat(3, {st_mode=S_IFREG|0755, st_size=23360, ...}) = 0 mmap(NULL, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_ANONYMOUS, -1, 0) = 0x2b8582cd5000 mmap(0x30bac00000, 2109696, PROT_READ|PROT_EXEC, MAP_PRIVATE|MAP_DENYWRITE, 3, 0) = 0x30bac00000 mprotect(0x30bac02000, 2097152, PROT_NONE) = 0 mmap(0x30bae02000, 8192, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_FIXED|MAP_DENYWRITE, 3, 0x2000) = 0x30bae02000 close(3) = 0 open("/lib64/libsepol.so.1", O_RDONLY) = 3 read(3, "\177ELF\2\1\1\0\0\0\0\0\0\0\0\0\3\0>\0\1\0\0\0\0=\0\2740\0\0\0"..., 832) = 832 fstat(3, {st_mode=S_IFREG|0755, st_size=247496, ...}) = 0 mmap(0x30bc000000, 2383136, PROT_READ|PROT_EXEC, MAP_PRIVATE|MAP_DENYWRITE, 3, 0) = 0x30bc000000 mprotect(0x30bc03b000, 2097152, PROT_NONE) = 0 mmap(0x30bc23b000, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_FIXED|MAP_DENYWRITE, 3, 0x3b000) = 0x30bc23b000 mmap(0x30bc23c000, 40224, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_FIXED|MAP_ANONYMOUS, -1, 0) = 0x30bc23c000 close(3) = 0 mmap(NULL, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_ANONYMOUS, -1, 0) = 0x2b8582cd6000 mmap(NULL, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_ANONYMOUS, -1, 0) = 0x2b8582cd7000 arch_prctl(ARCH_SET_FS, 0x2b8582cd6c50) = 0 mprotect(0x30bba07000, 4096, PROT_READ) = 0 mprotect(0x30bab4d000, 16384, PROT_READ) = 0 mprotect(0x30bb215000, 4096, PROT_READ) = 0 mprotect(0x30ba61b000, 4096, PROT_READ) = 0 mprotect(0x30bae02000, 4096, PROT_READ) = 0 munmap(0x2b8582cc1000, 71746) = 0 set_tid_address(0x2b8582cd6ce0) = 24102 set_robust_list(0x2b8582cd6cf0, 0x18) = 0 futex(0x7fff72d02d6c, FUTEX_WAKE_PRIVATE, 1) = 0 rt_sigaction(SIGRTMIN, {0x30bb005370, [], SA_RESTORER|SA_SIGINFO, 0x30bb00e7c0}, NULL, 8) = 0 rt_sigaction(SIGRT_1, {0x30bb0052b0, [], SA_RESTORER|SA_RESTART|SA_SIGINFO, 0x30bb00e7c0}, NULL, 8) = 0 rt_sigprocmask(SIG_UNBLOCK, [RTMIN RT_1], NULL, 8) = 0 getrlimit(RLIMIT_STACK, {rlim_cur=10240*1024, rlim_max=RLIM_INFINITY}) = 0 access("/etc/selinux/", F_OK) = 0 brk(0) = 0x1c521000 brk(0x1c542000) = 0x1c542000 open("/etc/selinux/config", O_RDONLY) = 3 fstat(3, 
{st_mode=S_IFREG|0644, st_size=448, ...}) = 0 mmap(NULL, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_ANONYMOUS, -1, 0) = 0x2b8582cc1000 read(3, "# This file controls the state o"..., 4096) = 448 read(3, "", 4096) = 0 close(3) = 0 munmap(0x2b8582cc1000, 4096) = 0 open("/proc/mounts", O_RDONLY) = 3 fstat(3, {st_mode=S_IFREG|0444, st_size=0, ...}) = 0 mmap(NULL, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_ANONYMOUS, -1, 0) = 0x2b8582cc1000 read(3, "rootfs / rootfs rw 0 0\n/dev/root"..., 4096) = 577 close(3) = 0 munmap(0x2b8582cc1000, 4096) = 0 open("/selinux/mls", O_RDONLY) = 3 read(3, "1", 19) = 1 close(3) = 0 socket(PF_FILE, SOCK_STREAM, 0) = 3 connect(3, {sa_family=AF_FILE, path="/var/run/setrans/.setrans-unix"...}, 110) = 0 sendmsg(3, {msg_name(0)=NULL, msg_iov(5)=[{"\1\0\0\0", 4}, {"\1\0\0\0", 4}, {"\1\0\0\0", 4}, {"\0", 1}, {"\0", 1}], msg_controllen=0, msg_flags=0}, MSG_NOSIGNAL) = 14 readv(3, [{"\1\0\0\0", 4}, {"\1\0\0\0", 4}, {"\0\0\0\0", 4}], 3) = 12 readv(3, [{"\0", 1}], 1) = 1 close(3) = 0 open("/usr/lib/locale/locale-archive", O_RDONLY) = 3 fstat(3, {st_mode=S_IFREG|0644, st_size=56413824, ...}) = 0 mmap(NULL, 56413824, PROT_READ, MAP_PRIVATE, 3, 0) = 0x2b8582cd8000 close(3) = 0 ioctl(1, SNDCTL_TMR_TIMEBASE or TCGETS, {B38400 opost isig icanon echo ...}) = 0 ioctl(1, TIOCGWINSZ, {ws_row=65, ws_col=137, ws_xpixel=0, ws_ypixel=0}) = 0 open("/usr/share/locale/locale.alias", O_RDONLY) = 3 fstat(3, {st_mode=S_IFREG|0644, st_size=2528, ...}) = 0 mmap(NULL, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_ANONYMOUS, -1, 0) = 0x2b85862a5000 read(3, "# Locale name alias data base.\n#"..., 4096) = 2528 read(3, "", 4096) = 0 close(3) = 0 munmap(0x2b85862a5000, 4096) = 0 open("/usr/share/locale/en_US.UTF-8/LC_TIME/coreutils.mo", O_RDONLY) = -1 ENOENT (No such file or directory) open("/usr/share/locale/en_US.utf8/LC_TIME/coreutils.mo", O_RDONLY) = -1 ENOENT (No such file or directory) open("/usr/share/locale/en_US/LC_TIME/coreutils.mo", O_RDONLY) = -1 ENOENT (No such file or directory) open("/usr/share/locale/en.UTF-8/LC_TIME/coreutils.mo", O_RDONLY) = -1 ENOENT (No such file or directory) open("/usr/share/locale/en.utf8/LC_TIME/coreutils.mo", O_RDONLY) = -1 ENOENT (No such file or directory) open("/usr/share/locale/en/LC_TIME/coreutils.mo", O_RDONLY) = -1 ENOENT (No such file or directory) lstat("/tmp", {st_mode=S_IFDIR|S_ISVTX|0777, st_size=4096, ...}) = 0 getxattr("/tmp", "system.posix_acl_access", 0x0, 0) = -1 ENODATA (No data available) getxattr("/tmp", "system.posix_acl_default", 0x0, 0) = -1 ENODATA (No data available) socket(PF_FILE, SOCK_STREAM, 0) = 3 fcntl(3, F_SETFL, O_RDWR|O_NONBLOCK) = 0 connect(3, {sa_family=AF_FILE, path="/var/run/nscd/socket"...}, 110) = -1 ENOENT (No such file or directory) close(3) = 0 socket(PF_FILE, SOCK_STREAM, 0) = 3 fcntl(3, F_SETFL, O_RDWR|O_NONBLOCK) = 0 connect(3, {sa_family=AF_FILE, path="/var/run/nscd/socket"...}, 110) = -1 ENOENT (No such file or directory) close(3) = 0 open("/etc/nsswitch.conf", O_RDONLY) = 3 fstat(3, {st_mode=S_IFREG|0644, st_size=1711, ...}) = 0 mmap(NULL, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_ANONYMOUS, -1, 0) = 0x2b85862a5000 read(3, "#\n# /etc/nsswitch.conf\n#\n# An ex"..., 4096) = 1711 read(3, "", 4096) = 0 close(3) = 0 munmap(0x2b85862a5000, 4096) = 0 open("/etc/ld.so.cache", O_RDONLY) = 3 fstat(3, {st_mode=S_IFREG|0644, st_size=71746, ...}) = 0 mmap(NULL, 71746, PROT_READ, MAP_PRIVATE, 3, 0) = 0x2b85862a5000 close(3) = 0 open("/lib64/libnss_files.so.2", O_RDONLY) = 3 read(3, 
"\177ELF\2\1\1\0\0\0\0\0\0\0\0\0\3\0>\0\1\0\0\0\340\37\0\0\0\0\0\0"..., 832) = 832 fstat(3, {st_mode=S_IFREG|0755, st_size=53880, ...}) = 0 mmap(NULL, 2139432, PROT_READ|PROT_EXEC, MAP_PRIVATE|MAP_DENYWRITE, 3, 0) = 0x2b85862b7000 mprotect(0x2b85862c1000, 2093056, PROT_NONE) = 0 mmap(0x2b85864c0000, 8192, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_FIXED|MAP_DENYWRITE, 3, 0x9000) = 0x2b85864c0000 close(3) = 0 mprotect(0x2b85864c0000, 4096, PROT_READ) = 0 munmap(0x2b85862a5000, 71746) = 0 open("/etc/passwd", O_RDONLY) = 3 fcntl(3, F_GETFD) = 0 fcntl(3, F_SETFD, FD_CLOEXEC) = 0 fstat(3, {st_mode=S_IFREG|0644, st_size=1823, ...}) = 0 mmap(NULL, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_ANONYMOUS, -1, 0) = 0x2b85862a5000 read(3, "root:x:0:0:root:/root:/bin/bash\n"..., 4096) = 1823 close(3) = 0 munmap(0x2b85862a5000, 4096) = 0 socket(PF_FILE, SOCK_STREAM, 0) = 3 fcntl(3, F_SETFL, O_RDWR|O_NONBLOCK) = 0 connect(3, {sa_family=AF_FILE, path="/var/run/nscd/socket"...}, 110) = -1 ENOENT (No such file or directory) close(3) = 0 socket(PF_FILE, SOCK_STREAM, 0) = 3 fcntl(3, F_SETFL, O_RDWR|O_NONBLOCK) = 0 connect(3, {sa_family=AF_FILE, path="/var/run/nscd/socket"...}, 110) = -1 ENOENT (No such file or directory) close(3) = 0 open("/etc/group", O_RDONLY) = 3 fcntl(3, F_GETFD) = 0 fcntl(3, F_SETFD, FD_CLOEXEC) = 0 fstat(3, {st_mode=S_IFREG|0644, st_size=743, ...}) = 0 mmap(NULL, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_ANONYMOUS, -1, 0) = 0x2b85862a5000 read(3, "root:x:0:root\nbin:x:1:root,bin,d"..., 4096) = 743 close(3) = 0 munmap(0x2b85862a5000, 4096) = 0 open("/tmp", O_RDONLY|O_NONBLOCK|O_DIRECTORY) = 3 fcntl(3, F_SETFD, FD_CLOEXEC) = 0 getdents(3, /* 8 entries */, 32768) = 264 lstat("/tmp/librsync-0.9.7.tar.gz", {st_mode=S_IFREG|0644, st_size=453802, ...}) = 0 getxattr("/tmp/librsync-0.9.7.tar.gz", "system.posix_acl_access", 0x0, 0) = -1 ENODATA (No data available) getxattr("/tmp/librsync-0.9.7.tar.gz", "system.posix_acl_default", 0x0, 0) = -1 ENODATA (No data available) lstat("/tmp/librsync-0.9.7", {st_mode=S_IFDIR|0777, st_size=4096, ...}) = 0 getxattr("/tmp/librsync-0.9.7", "system.posix_acl_access", 0x0, 0) = -1 ENODATA (No data available) getxattr("/tmp/librsync-0.9.7", "system.posix_acl_default", 0x0, 0) = -1 ENODATA (No data available) open("/etc/passwd", O_RDONLY) = 4 fcntl(4, F_GETFD) = 0 fcntl(4, F_SETFD, FD_CLOEXEC) = 0 fstat(4, {st_mode=S_IFREG|0644, st_size=1823, ...}) = 0 mmap(NULL, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_ANONYMOUS, -1, 0) = 0x2b85862a5000 read(4, "root:x:0:0:root:/root:/bin/bash\n"..., 4096) = 1823 read(4, "", 4096) = 0 close(4) = 0 munmap(0x2b85862a5000, 4096) = 0 open("/etc/ld.so.cache", O_RDONLY) = 4 fstat(4, {st_mode=S_IFREG|0644, st_size=71746, ...}) = 0 mmap(NULL, 71746, PROT_READ, MAP_PRIVATE, 4, 0) = 0x2b85862a5000 close(4) = 0 open("/lib64/libnss_ldap.so.2", O_RDONLY) = 4 read(4, "\177ELF\2\1\1\0\0\0\0\0\0\0\0\0\3\0>\0\1\0\0\0\300r\4\0\0\0\0\0"..., 832) = 832 fstat(4, {st_mode=S_IFREG|0755, st_size=3169960, ...}) = 0 mmap(NULL, 5329912, PROT_READ|PROT_EXEC, MAP_PRIVATE|MAP_DENYWRITE, 4, 0) = 0x2b85864c2000 mprotect(0x2b858679e000, 2093056, PROT_NONE) = 0 mmap(0x2b858699d000, 176128, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_FIXED|MAP_DENYWRITE, 4, 0x2db000) = 0x2b858699d000 mmap(0x2b85869c8000, 62456, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_FIXED|MAP_ANONYMOUS, -1, 0) = 0x2b85869c8000 close(4) = 0 open("/lib64/libcom_err.so.2", O_RDONLY) = 4 read(4, "\177ELF\2\1\1\0\0\0\0\0\0\0\0\0\3\0>\0\1\0\0\0\320\n\300\2770\0\0\0"..., 832) = 832 fstat(4, 
{st_mode=S_IFREG|0755, st_size=10000, ...}) = 0 mmap(0x30bfc00000, 2103048, PROT_READ|PROT_EXEC, MAP_PRIVATE|MAP_DENYWRITE, 4, 0) = 0x30bfc00000 mprotect(0x30bfc02000, 2093056, PROT_NONE) = 0 mmap(0x30bfe01000, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_FIXED|MAP_DENYWRITE, 4, 0x1000) = 0x30bfe01000 close(4) = 0 open("/lib64/libkeyutils.so.1", O_RDONLY) = 4 read(4, "\177ELF\2\1\1\0\0\0\0\0\0\0\0\0\3\0>\0\1\0\0\0`\n@\2760\0\0\0"..., 832) = 832 fstat(4, {st_mode=S_IFREG|0755, st_size=9472, ...}) = 0 mmap(0x30be400000, 2102416, PROT_READ|PROT_EXEC, MAP_PRIVATE|MAP_DENYWRITE, 4, 0) = 0x30be400000 mprotect(0x30be402000, 2093056, PROT_NONE) = 0 mmap(0x30be601000, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_FIXED|MAP_DENYWRITE, 4, 0x1000) = 0x30be601000 close(4) = 0 open("/lib64/libresolv.so.2", O_RDONLY) = 4 read(4, "\177ELF\2\1\1\0\0\0\0\0\0\0\0\0\3\0>\0\1\0\0\0\2402\0\2760\0\0\0"..., 832) = 832 fstat(4, {st_mode=S_IFREG|0755, st_size=92736, ...}) = 0 mmap(0x30be000000, 2181864, PROT_READ|PROT_EXEC, MAP_PRIVATE|MAP_DENYWRITE, 4, 0) = 0x30be000000 mprotect(0x30be011000, 2097152, PROT_NONE) = 0 mmap(0x30be211000, 8192, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_FIXED|MAP_DENYWRITE, 4, 0x11000) = 0x30be211000 mmap(0x30be213000, 6888, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_FIXED|MAP_ANONYMOUS, -1, 0) = 0x30be213000 close(4) = 0 mprotect(0x30be211000, 4096, PROT_READ) = 0 munmap(0x2b85862a5000, 71746) = 0 rt_sigaction(SIGPIPE, {0x1, [], SA_RESTORER, 0x30ba8302d0}, {SIG_DFL, [], 0}, 8) = 0 geteuid() = 0 futex(0x2b85869c7708, FUTEX_WAKE_PRIVATE, 2147483647) = 0 open("/etc/ldap.conf", O_RDONLY) = 4 fstat(4, {st_mode=S_IFREG|0644, st_size=9119, ...}) = 0 fstat(4, {st_mode=S_IFREG|0644, st_size=9119, ...}) = 0 mmap(NULL, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_ANONYMOUS, -1, 0) = 0x2b85862a5000 read(4, "# @(#)$Id: ldap.conf,v 1.38 2006"..., 4096) = 4096 read(4, "Use the OpenLDAP password change"..., 4096) = 4096 read(4, " OpenLDAP 2.0 and earlier is \"no"..., 4096) = 927 read(4, "", 4096) = 0 close(4) = 0 munmap(0x2b85862a5000, 4096) = 0 uname({sys="Linux", node="massive.answeron.com", ...}) = 0 open("/etc/resolv.conf", O_RDONLY) = 4 fstat(4, {st_mode=S_IFREG|0644, st_size=107, ...}) = 0 mmap(NULL, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_ANONYMOUS, -1, 0) = 0x2b85862a5000 read(4, "; generated by /sbin/dhclient-sc"..., 4096) = 107 read(4, "", 4096) = 0 close(4) = 0 munmap(0x2b85862a5000, 4096) = 0 socket(PF_FILE, SOCK_STREAM, 0) = 4 fcntl(4, F_SETFL, O_RDWR|O_NONBLOCK) = 0 connect(4, {sa_family=AF_FILE, path="/var/run/nscd/socket"...}, 110) = -1 ENOENT (No such file or directory) close(4) = 0 socket(PF_FILE, SOCK_STREAM, 0) = 4 fcntl(4, F_SETFL, O_RDWR|O_NONBLOCK) = 0 connect(4, {sa_family=AF_FILE, path="/var/run/nscd/socket"...}, 110) = -1 ENOENT (No such file or directory) close(4) = 0 open("/etc/host.conf", O_RDONLY) = 4 fstat(4, {st_mode=S_IFREG|0644, st_size=17, ...}) = 0 mmap(NULL, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_ANONYMOUS, -1, 0) = 0x2b85862a5000 read(4, "order hosts,bind\n", 4096) = 17 read(4, "", 4096) = 0 close(4) = 0 munmap(0x2b85862a5000, 4096) = 0 futex(0x30bab54d44, FUTEX_WAKE_PRIVATE, 2147483647) = 0 open("/etc/hosts", O_RDONLY) = 4 fcntl(4, F_GETFD) = 0 fcntl(4, F_SETFD, FD_CLOEXEC) = 0 fstat(4, {st_mode=S_IFREG|0644, st_size=187, ...}) = 0 mmap(NULL, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_ANONYMOUS, -1, 0) = 0x2b85862a5000 read(4, "# Do not remove the following li"..., 4096) = 187 read(4, "", 4096) = 0 close(4) = 0 munmap(0x2b85862a5000, 4096) = 0 
open("/etc/ld.so.cache", O_RDONLY) = 4 fstat(4, {st_mode=S_IFREG|0644, st_size=71746, ...}) = 0 mmap(NULL, 71746, PROT_READ, MAP_PRIVATE, 4, 0) = 0x2b85862a5000 close(4) = 0 open("/lib64/libnss_dns.so.2", O_RDONLY) = 4 read(4, "\177ELF\2\1\1\0\0\0\0\0\0\0\0\0\3\0>\0\1\0\0\0\340\17\0\0\0\0\0\0"..., 832) = 832 fstat(4, {st_mode=S_IFREG|0755, st_size=23736, ...}) = 0 mmap(NULL, 2113792, PROT_READ|PROT_EXEC, MAP_PRIVATE|MAP_DENYWRITE, 4, 0) = 0x2b85869d8000 mprotect(0x2b85869dc000, 2093056, PROT_NONE) = 0 mmap(0x2b8586bdb000, 8192, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_FIXED|MAP_DENYWRITE, 4, 0x3000) = 0x2b8586bdb000 close(4) = 0 mprotect(0x2b8586bdb000, 4096, PROT_READ) = 0 munmap(0x2b85862a5000, 71746) = 0 socket(PF_INET, SOCK_DGRAM, IPPROTO_IP) = 4 connect(4, {sa_family=AF_INET, sin_port=htons(53), sin_addr=inet_addr("192.168.10.20")}, 28) = 0 fcntl(4, F_GETFL) = 0x2 (flags O_RDWR) fcntl(4, F_SETFL, O_RDWR|O_NONBLOCK) = 0 gettimeofday({1276265920, 823870}, NULL) = 0 poll([{fd=4, events=POLLOUT}], 1, 0) = 1 ([{fd=4, revents=POLLOUT}]) sendto(4, "C\v\1\0\0\1\0\0\0\0\0\0\7massive\10answeron\3co"..., 38, MSG_NOSIGNAL, NULL, 0) = 38 poll([{fd=4, events=POLLIN}], 1, 5000) = 1 ([{fd=4, revents=POLLIN}]) ioctl(4, FIONREAD, [122]) = 0 recvfrom(4, "C\v\205\200\0\1\0\1\0\2\0\2\7massive\10answeron\3co"..., 1024, 0, {sa_family=AF_INET, sin_port=htons(53), sin_addr=inet_addr("192.168.10.20")}, [16]) = 122 close(4) = 0 open("/etc/openldap/ldap.conf", O_RDONLY) = 4 fstat(4, {st_mode=S_IFREG|0644, st_size=335, ...}) = 0 mmap(NULL, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_ANONYMOUS, -1, 0) = 0x2b85862a5000 read(4, "#\n# LDAP Defaults\n#\n\n# See ldap."..., 4096) = 335 read(4, "", 4096) = 0 close(4) = 0 munmap(0x2b85862a5000, 4096) = 0 getuid() = 0 geteuid() = 0 getgid() = 0 getegid() = 0 open("/root/ldaprc", O_RDONLY) = -1 ENOENT (No such file or directory) open("/root/.ldaprc", O_RDONLY) = -1 ENOENT (No such file or directory) stat("/etc/ldap.conf", {st_mode=S_IFREG|0644, st_size=9119, ...}) = 0 geteuid() = 0 brk(0x1c566000) = 0x1c566000 open("/etc/hosts", O_RDONLY) = 4 fcntl(4, F_GETFD) = 0 fcntl(4, F_SETFD, FD_CLOEXEC) = 0 fstat(4, {st_mode=S_IFREG|0644, st_size=187, ...}) = 0 mmap(NULL, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_ANONYMOUS, -1, 0) = 0x2b85862a5000 read(4, "# Do not remove the following li"..., 4096) = 187 read(4, "", 4096) = 0 close(4) = 0 munmap(0x2b85862a5000, 4096) = 0 open("/etc/hosts", O_RDONLY) = 4 fcntl(4, F_GETFD) = 0 fcntl(4, F_SETFD, FD_CLOEXEC) = 0 fstat(4, {st_mode=S_IFREG|0644, st_size=187, ...}) = 0 mmap(NULL, 4096, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_ANONYMOUS, -1, 0) = 0x2b85862a5000 read(4, "# Do not remove the following li"..., 4096) = 187 read(4, "", 4096) = 0 close(4) = 0 munmap(0x2b85862a5000, 4096) = 0 socket(PF_INET, SOCK_DGRAM, IPPROTO_IP) = 4 connect(4, {sa_family=AF_INET, sin_port=htons(53), sin_addr=inet_addr("192.168.10.20")}, 28) = 0 fcntl(4, F_GETFL) = 0x2 (flags O_RDWR) fcntl(4, F_SETFL, O_RDWR|O_NONBLOCK) = 0 gettimeofday({1276265920, 855948}, NULL) = 0 poll([{fd=4, events=POLLOUT}], 1, 0) = 1 ([{fd=4, revents=POLLOUT}]) sendto(4, "\32 \1\0\0\1\0\0\0\0\0\0\4ldap\10answeron\3com\0\0"..., 35, MSG_NOSIGNAL, NULL, 0) = 35 poll([{fd=4, events=POLLIN}], 1, 5000) = 1 ([{fd=4, revents=POLLIN}]) ioctl(4, FIONREAD, [104]) = 0 recvfrom(4, "\32 \205\200\0\1\0\1\0\1\0\0\4ldap\10answeron\3com\0\0"..., 1024, 0, {sa_family=AF_INET, sin_port=htons(53), sin_addr=inet_addr("192.168.10.20")}, [16]) = 104 close(4) = 0 socket(PF_INET, SOCK_DGRAM, IPPROTO_IP) = 4 connect(4, 
{sa_family=AF_INET, sin_port=htons(53), sin_addr=inet_addr("192.168.10.20")}, 28) = 0 fcntl(4, F_GETFL) = 0x2 (flags O_RDWR) fcntl(4, F_SETFL, O_RDWR|O_NONBLOCK) = 0 gettimeofday({1276265920, 858536}, NULL) = 0 poll([{fd=4, events=POLLOUT}], 1, 0) = 1 ([{fd=4, revents=POLLOUT}]) sendto(4, "I\375\1\0\0\1\0\0\0\0\0\0\4ldap\10answeron\3com\0\0"..., 35, MSG_NOSIGNAL, NULL, 0) = 35 poll([{fd=4, events=POLLIN}], 1, 5000) = 1 ([{fd=4, revents=POLLIN}]) ioctl(4, FIONREAD, [139]) = 0 recvfrom(4, "I\375\205\200\0\1\0\2\0\2\0\2\4ldap\10answeron\3com\0\0"..., 1024, 0, {sa_family=AF_INET, sin_port=htons(53), sin_addr=inet_addr("192.168.10.20")}, [16]) = 139 close(4) = 0 socket(PF_INET, SOCK_STREAM, IPPROTO_IP) = 4 fcntl(4, F_SETFD, FD_CLOEXEC) = 0 setsockopt(4, SOL_SOCKET, SO_KEEPALIVE, [1], 4) = 0 setsockopt(4, SOL_TCP, TCP_NODELAY, [1], 4) = 0 fcntl(4, F_GETFL) = 0x2 (flags O_RDWR) fcntl(4, F_SETFL, O_RDWR|O_NONBLOCK) = 0 connect(4, {sa_family=AF_INET, sin_port=htons(389), sin_addr=inet_addr("10.20.0.30")}, 16) = -1 EINPROGRESS (Operation now in progress) poll([{fd=4, events=POLLOUT|POLLERR|POLLHUP}], 1, 120000 And thats where it stops, right there after that last 120000.... Using strace, I can obviously CTRL+C to keep going. But like I said, normally the terminal completely freezes. Anyone have any clues?
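Based on the trace, the hang looks like nss_ldap blocking while it tries to reach the LDAP server on port 389 (note the final poll with a 120000 ms timeout, and the earlier failed connects to /var/run/nscd/socket because nscd is not running). A hedged sketch of /etc/ldap.conf settings that are commonly tuned so lookups fail fast instead of freezing the shell is shown below; the option names are standard nss_ldap/pam_ldap settings, but the values are illustrative assumptions, not the poster's actual configuration:
    # /etc/ldap.conf (nss_ldap) - fail fast when the directory is unreachable
    bind_policy soft                  # do not keep retrying the bind forever
    bind_timelimit 5                  # seconds to wait for connect/bind
    timelimit 10                      # seconds to wait for a search
    nss_reconnect_tries 2             # limit reconnect attempts
    nss_initgroups_ignoreusers root   # keep root's group lookups out of LDAP
Running nscd would also cache passwd/group lookups and mask short directory outages.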

    Read the article

  • Software Engineering Practices – Different Projects should have different maturity levels

    - by Dylan Smith
    I’ve had a lot of discussions at the office lately about the drastically different sets of software engineering practices used on our various projects, if what we are doing is appropriate, and what factors should you be considering when determining what practices are most appropriate in a given context. I wanted to write up my thoughts in a little more detail on this subject, so here we go: If you compare any two software projects (specifically comparing their codebases) you’ll often see very different levels of maturity in the software engineering practices employed. By software engineering practices, I’m specifically referring to the quality of the code and the amount of technical debt present in the project. Things such as Test Driven Development, Domain Driven Design, Behavior Driven Development, proper adherence to the SOLID principles, etc. are all practices that you would expect at the mature end of the spectrum. At the other end of the spectrum would be the quick-and-dirty solutions that are done using something like an Access Database, Excel Spreadsheet, or maybe some quick “drag-and-drop coding”. For this blog post I’m going to refer to this as the Software Engineering Maturity Spectrum (SEMS). I believe there is a time and a place for projects at every part of that SEMS. The risks and costs associated with under-engineering solutions have been written about a million times over so I won’t bother going into them again here, but there are also (unnecessary) costs with over-engineering a solution. Sometimes putting multiple layers, and IoC containers, and abstracting out the persistence, etc is complete overkill if a one-time use Access database could solve the problem perfectly well. A lot of software developers I talk to seem to automatically jump to the very right-hand side of this SEMS in everything they do. A common rationalization I hear is that it may seem like a small trivial application today, but these things always grow and stick around for many years, then you’re stuck maintaining a big ball of mud. I think this is a cop-out. Sure you can’t always anticipate how an application will be used or grow over its lifetime (can you ever??), but that doesn’t mean you can’t manage it and evolve the underlying software architecture as necessary (even if that means having to toss the code out and re-write it at some point…maybe even multiple times). My thoughts are that we should be making a conscious decision around the start of each project approximately where on the SEMS we want the project to exist. I believe this decision should be based on 3 factors: 1. Importance - How important to the business is this application? What is the impact if the application were to suddenly stop working? 2. Complexity - How complex is the application functionality? 3. Life-Expectancy - How long is this application expected to be in use? Is this a one-time use application, does it fill a short-term need, or is it more strategic and is expected to be in-use for many years to come? Of course this isn’t an exact science. You can’t say that Project X should be at the 73% mark on the SEMS and expect that to be helpful. My point is not that you need to precisely figure out what point on the SEMS the project should be at then translate that into some prescriptive set of practices and techniques you should be using. 
Rather my point is that we need to be aware that there is a spectrum, and that not everything is going to be (or should be) at the edges of that spectrum, indeed a large number of projects should probably fall somewhere within the middle; and different projects should adopt a different level of software engineering practices and maturity levels based on the needs of that project. To give an example of this way of thinking from my day job: Every couple of years my company plans and hosts a large event where ~400 of our customers all fly in to one location for a multi-day event with various activities. We have some staff whose job it is to organize the logistics of this event, which includes tracking which flights everybody is booked on, arranging for transportation to/from airports, arranging for hotel rooms, name tags, etc The last time we arranged this event all these various pieces of data were tracked in separate spreadsheets and reconciliation and cross-referencing of all the data was literally done by hand using printed copies of the spreadsheets and several people sitting around a table going down each list row by row. Obviously there is some room for improvement in how we are using software to manage the event’s logistics. The next time this event occurs we plan to provide the event planning staff with a more intelligent tool (either an Excel spreadsheet or probably an Access database) that can track all the information in one location and make sure that the various pieces of data are properly linked together (so for example if a person cancels you only need to delete them from one place, and not a dozen separate lists). This solution would fall at or near the very left end of the SEMS meaning that we will just quickly create something with very little attention paid to using mature software engineering practices. If we examine this project against the 3 criteria I listed above for determining it’s place within the SEMS we can see why: Importance – If this application were to stop working the business doesn’t grind to a halt, revenue doesn’t stop, and in fact our customers wouldn’t even notice since it isn’t a customer facing application. The impact would simply be more work for our event planning staff as they revert back to the previous way of doing things (assuming we don’t have any data loss). Complexity – The use cases for this project are pretty straightforward. It simply needs to manage several lists of data, and link them together appropriately. Precisely the task that access (and/or Excel) can do with minimal custom development required. Life-Expectancy – For this specific project we’re only planning to create something to be used for the one event (we only hold these events every 2 years). If it works well this may change (see below). Let’s assume we hack something out quickly and it works great when we plan the next event. We may decide that we want to make some tweaks to the tool and adopt it for planning all future events of this nature. In that case we should examine where the current application is on the SEMS, and make a conscious decision whether something needs to be done to move it further to the right based on the new objectives and goals for this application. This may mean scrapping the access database and re-writing it as an actual web or windows application. In this case, the life-expectancy changed, but let’s assume the importance and complexity didn’t change all that much. 
We can still probably get away with not adopting a lot of the so-called “best practices”. For example, we can probably still use some of the RAD tooling available and might have an Autonomous View style design that connects directly to the database and binds to typed datasets (we might even choose to simply leave it as an access database and continue using it; this is a decision that needs to be made on a case-by-case basis). At Anvil Digital we have aspirations to become a primarily product-based company. So let’s say we use this tool to plan a handful of events internally, and everybody loves it. Maybe a couple years down the road we decide we want to package the tool up and sell it as a product to some of our customers. In this case the project objectives/goals change quite drastically. Now the tool becomes a source of revenue, and the impact of it suddenly stopping working is significantly less acceptable. Also as we hold focus groups, and gather feedback from customers and potential customers there’s a pretty good chance the feature-set and complexity will have to grow considerably from when we were using it only internally for planning a small handful of events for one company. In this fictional scenario I would expect the target on the SEMS to jump to the far right. Depending on how we implemented the previous release we may be able to refactor and evolve the existing codebase to introduce a more layered architecture, a robust set of automated tests, introduce a proper ORM and IoC container, etc. More likely in this example the jump along the SEMS would be so large we’d probably end up scrapping the current code and re-writing. Although, if it was a slow phased roll-out to only a handful of customers, where we collected feedback, made some tweaks, and then rolled out to a couple more customers, we may be able to slowly refactor and evolve the code over time rather than tossing it out and starting from scratch. The key point I’m trying to get across is not that you should be throwing out your code and starting from scratch all the time. But rather that you should be aware of when and how the context and objectives around a project changes and periodically re-assess where the project currently falls on the SEMS and whether that needs to be adjusted based on changing needs. Note: There is also the idea of “spectrum decay”. Since our industry is rapidly evolving, what we currently accept as mature software engineering practices (the right end of the SEMS) probably won’t be the same 3 years from now. If you have a project that you were to assess at somewhere around the 80% mark on the SEMS today, but don’t touch the code for 3 years and come back and re-assess its position, it will almost certainly have changed since the right end of the SEMS will have moved farther out (maybe the project is now only around 60% due to decay). Developer Skills Another important aspect to this whole discussion is around the skill sets of your architects and lead developers. When talking about the progression of a developers skills from junior->intermediate->senior->… they generally start by only being able to write code that belongs on the left side of the SEMS and as they gain more knowledge and skill they become capable of working at a higher and higher level along the SEMS. 
We all realize that the learning never stops, but eventually you’ll get to the point where you can comfortably develop at the right-end of the SEMS (the exact practices and techniques that translates to is constantly changing, but that’s not the point here). A critical skill that I’d love to see more evidence of in our industry is the most senior guys not only being able to work at the right-end of the SEMS, but more importantly be able to consciously work at any point along the SEMS as project needs dictate. An even more valuable skill would be if you could make the conscious decision to move a projects code further right on the SEMS (based on changing needs) and do so in an incremental manner without having to start from scratch. An exercise that I’m planning to go through with all of our projects here at Anvil in the near future is to map out where I believe each project currently falls within this SEMS, where I believe the project *should* be on the SEMS based on the business needs, and for those that don’t match up (i.e. most of them) come up with a plan to improve the situation.

    Read the article

  • SQL SERVER – Weekly Series – Memory Lane – #049

    - by Pinal Dave
    Here is the list of selected articles of SQLAuthority.com across all these years. Instead of just listing all the articles I have selected a few of my most favorite articles and have listed them here with additional notes below it. Let me know which one of the following is your favorite article from memory lane. 2007 Two Connections Related Global Variables Explained – @@CONNECTIONS and @@MAX_CONNECTIONS @@CONNECTIONS Returns the number of attempted connections, either successful or unsuccessful, since SQL Server was last started. @@MAX_CONNECTIONS Returns the maximum number of simultaneous user connections allowed on an instance of SQL Server. The number returned is not necessarily the number currently configured. Query Editor – Microsoft SQL Server Management Studio This post may be very simple for most users of SQL Server 2005. Earlier this year, I received one question many times – Where is Query Analyzer in SQL Server 2005? I wrote a small post about it and pointed many users to that post – SQL SERVER – 2005 Query Analyzer – Microsoft SQL SERVER Management Studio. Recently I have been receiving a similar question. OUTPUT Clause Example and Explanation with INSERT, UPDATE, DELETE SQL Server 2005 has a new OUTPUT clause, which is quite useful. The OUTPUT clause has access to the inserted and deleted tables (virtual tables) just like triggers. The OUTPUT clause can be used to return values to the client. The OUTPUT clause can be used with INSERT, UPDATE, or DELETE to identify the actual rows affected by these statements. The OUTPUT clause can populate a table variable, a permanent table, or a temporary table. Even though @@IDENTITY still works with SQL Server 2005, I find the OUTPUT clause very easy and powerful to use. Let us understand the OUTPUT clause using an example. Find Name of The SQL Server Instance Based on the database server, stored procedures have to run different logic. We came up with two different solutions. 1) When the database schema changed significantly, we wrote a completely new stored procedure and deprecated the older version once it was not needed. 2) When logic depended on the server name we used the global variable @@SERVERNAME. It was very convenient while writing migration scripts which depended on the server name for the same database. Explanation of TRY…CATCH and ERROR Handling With RAISERROR Function One of the developers at my company thought that we cannot use the RAISERROR function inside the new SQL Server 2005 TRY…CATCH feature. When asked for an explanation he offered the SQL SERVER – 2005 Explanation of TRY…CATCH and ERROR Handling article as an excuse, suggesting that I did not give an example of RAISERROR with TRY…CATCH. We all thought it was funny. Just to keep the records straight, TRY…CATCH can certainly use the RAISERROR function. Different Types of Cache Objects Several kinds of objects can be stored in the procedure cache: Compiled Plans: When the query optimizer finishes compiling a query plan, the principal output is the compiled plan. Execution contexts: While executing a compiled plan, SQL Server has to keep track of information about the state of execution. Cursors: Cursors track the execution state of server-side cursors, including the cursor’s current location within a resultset. Algebrizer trees: The Algebrizer’s job is to produce an algebrizer tree, which represents the logical structure of a query. Open SSMS From Command Prompt – sqlwb.exe Example This article is written by request and suggestion of a Sr. Web Developer at my organization. 
Due to the nature of this article most of the content is referred from Books Online. sqlwb is the command prompt utility which opens SQL Server Management Studio. The sqlwb command does not run queries from the command prompt; the sqlcmd utility runs queries from the command prompt – read about it for more information. 2008 Puzzle – Solution – Computed Columns Datatype Explanation Just a day before I wrote the article SQL SERVER – Puzzle – Computed Columns Datatype Explanation which was inspired by SQL Server MVP Jacob Sebastian. I suggest that before continuing this article you read the original puzzle question SQL SERVER – Puzzle – Computed Columns Datatype Explanation. The question was if the computed column was of datatype TINYINT how to create a Computed Column of datatype INT? 2008 – Find If Index is Being Used in Database It is very often I get a query asking how to find if any index is being used in the database or not. If any database has many indexes and not all indexes are used it can adversely affect performance. A higher number of indexes slows down INSERT / UPDATE / DELETE operations but speeds up SELECT operations. It is recommended to drop any unused indexes from the table to improve performance. 2009 Interesting Observation – Execution Plan and Results of Aggregate Concatenation Queries If you want to see what’s going on here, I think you need to shift your point of view from an implementation-centric view to an ANSI point of view. ANSI does not guarantee the processing order. Figure 2 is interesting, but it will be potentially misleading if you don’t understand the ANSI rule-set SQL Server operates under in most cases. Implementation thinking can certainly be useful at times when you really need that multi-million row query to finish before the backup fires off, but in this case, it’s counterproductive to understanding what is going on. SQL Server Management Studio and Client Statistics Client Statistics are very important. Many times, people relate a query’s execution plan to query cost. This is not a good comparison. Both parameters are different, and they are not always related. It is possible that the query cost of a statement is low, but the amount of data returned is considerably larger, which causes the query to run slowly. How do we know if any query is retrieving a large amount of data or very little data? 2010 I encourage all of you to go through the complete series and write your own on the subject. If you write an article and send it to me, I will publish it on this blog with due credit to you. If you write on your own blog, I will update this blog post pointing to your blog post. 
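The OUTPUT clause described in the 2007 notes above is easiest to see with a small example. Here is a minimal T-SQL sketch; the table and column names are invented for illustration and are not from the original article:
    -- capture the rows actually affected, without a second SELECT or @@IDENTITY
    CREATE TABLE dbo.Orders (OrderID INT IDENTITY(1,1), Amount MONEY);
    DECLARE @Inserted TABLE (OrderID INT, Amount MONEY);
    INSERT INTO dbo.Orders (Amount)
    OUTPUT inserted.OrderID, inserted.Amount INTO @Inserted   -- rows from the inserted virtual table
    VALUES (10.00);
    -- OUTPUT also works on DELETE (and UPDATE), exposing the deleted virtual table
    DELETE FROM dbo.Orders
    OUTPUT deleted.OrderID, deleted.Amount
    WHERE Amount < 20.00;
    SELECT OrderID, Amount FROM @Inserted;
Unlike @@IDENTITY, OUTPUT ... INTO captures every affected row and column, which is one reason it is described above as easy and powerful to use.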
SQL SERVER – ORDER BY Does Not Work – Limitation of the View 1 SQL SERVER – Adding Column is Expensive by Joining Table Outside View – Limitation of the View 2 SQL SERVER – Index Created on View not Used Often – Limitation of the View 3 SQL SERVER – SELECT * and Adding Column Issue in View – Limitation of the View 4 SQL SERVER – COUNT(*) Not Allowed but COUNT_BIG(*) Allowed – Limitation of the View 5 SQL SERVER – UNION Not Allowed but OR Allowed in Index View – Limitation of the View 6 SQL SERVER – Cross Database Queries Not Allowed in Indexed View – Limitation of the View 7 SQL SERVER – Outer Join Not Allowed in Indexed Views – Limitation of the View 8 SQL SERVER – SELF JOIN Not Allowed in Indexed View – Limitation of the View 9 SQL SERVER – Keywords View Definition Must Not Contain for Indexed View – Limitation of the View 10 SQL SERVER – View Over the View Not Possible with Index View – Limitations of the View 11 SQL SERVER – Get Query Running in Session I was recently looking for syntax where I needed a query running in any particular session. I always remembered the syntax and had actually written it down before, but somehow it was not coming to mind quickly this time. I searched online and I ended up on my own article written last year SQL SERVER – Get Last Running Query Based on SPID. I felt that I am getting old because I forgot this really simple syntax. Find Total Number of Transaction on Interval In one of my recent Performance Tuning assignments I was asked how someone can know how many transactions are happening on a server during a certain interval. I had a handy script for the same. The following script displays transactions that happened on the server at an interval of one minute. You can change the WAITFOR DELAY to any other interval and it should work. 2011 Here are two DMVs which are newly introduced in SQL Server 2012 and provide vital information about SQL Server. DMV – sys.dm_os_volume_stats – Information about operating system volume DMV – sys.dm_os_windows_info – Information about Operating System SQL Backup and FTP – A Quick and Handy Tool I have used this tool extensively since 2009 on numerous occasions and found it to be very impressive. What separates it from the crowd the most is its apparent simplicity and speed. When I install SQLBackupAndFTP and configure backups – all in 1 or 2 minutes – my clients are always impressed. Quick Note about JOIN – Common Questions and Simple Answers In this blog post we are going to talk about JOIN and lots of things related to the JOIN. I recently started office hours to answer questions and issues of the community. I receive so many questions that are related to JOIN. I will share a few of the same over here. Most of them are basic, but note that the basics are of great importance. 2012 Importance of User Without Login Question: “In recent version of SQL Server we can create user without login. What is the use of it?” Great question indeed. Let me first attempt to answer this question, but after reading my answer I need your help. I want you to help him as well by adding more value to it. Preserve Leading Zero While Copying to Excel from SSMS Earlier I wrote two articles about how to efficiently copy data from SSMS to Excel. Since I wrote that post there has been plenty of interest generated on this subject. There are a few questions I keep on getting over this subject. One of the questions is how to get the leading zero preserved while copying the data from SSMS to Excel. 
Well it is almost the same way as my earlier post SQL SERVER – Excel Losing Decimal Values When Value Pasted from SSMS ResultSet. The key here is in EXCEL and not in SQL Server. Solution – 2 T-SQL Puzzles – Display Star and Shortest Code to Display 1 Earlier on this blog we had asked two puzzles. The response from all of you is nothing but amazing. I have received 350+ responses. Many are valid and many were indeed something I had not thought about. I strongly suggest you read all the puzzles and their answers here – trust me, if you start reading the comments you will not stop till you read every single comment. Seriously, trust me on it. Personally I have learned a lot from it. Identify Most Resource Intensive Queries – SQL in Sixty Seconds #028 – Video http://www.youtube.com/watch?v=TvlYy-TGaaA Importance of User Without Login – T-SQL Demo Script Earlier I wrote a blog post about SQL SERVER – Importance of User Without Login and my friend and SQL Expert Vinod Kumar has written an excellent follow-up blog post about Contained Databases inside SQL Server 2012. Now lots of people have asked me if I can also explain the same concept again, so here is a small demonstration of it. Let me show you how a user without login can help. Before we continue on this subject I strongly recommend that you read my earlier blog post here. In the following demo I am going to demonstrate the following situation: Login using the System Admin account Create a user without login Checking Access Impersonate the user without login Checking Access Revert Impersonation Give Permission to user without login Impersonate the user without login Checking Access Revert Impersonation Clean up Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Memory Lane, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology
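The user-without-login demo outlined in the 2012 notes above boils down to only a few statements. The following is a minimal, hedged T-SQL sketch of that flow; the user and table names are hypothetical:
    -- run as a sysadmin; names are made up for illustration
    CREATE USER ReportReader WITHOUT LOGIN;
    GRANT SELECT ON dbo.Orders TO ReportReader;
    EXECUTE AS USER = 'ReportReader';     -- impersonate the user without login
    SELECT USER_NAME() AS CurrentUser;    -- checking access: who are we now?
    SELECT COUNT(*) FROM dbo.Orders;      -- succeeds: SELECT was granted
    -- an INSERT here would fail, since no INSERT permission was granted
    REVERT;                               -- back to the original security context
    DROP USER ReportReader;               -- clean up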

    Read the article

  • O’Reilly Deals to 9/June/2014 05:00 PT – 50% off E-Books on Regular Expressions

    - by TATWORTH
    Originally posted on: http://geekswithblogs.net/TATWORTH/archive/2014/06/06/orsquoreilly-deals-to-9june2014-0500-ptndash50-off-e-books-on-regular.aspx Until 9/June/2014 05:00 PT, O’Reilly are offering 50% off E-Books on Regular Expressions at http://shop.oreilly.com/category/deals/regular-expressions-owo.do?code=DEAL&imm_mid=0bd938&cmp=em-prog-books-videos-lp-dod_regex. “Regular expressions—powerful tools for manipulating text and data—are now standard features in most languages and tools. Yet despite their widespread availability and unparalleled power, regular expressions are frequently underused. With ebooks and videos from shop.oreilly.com, learn tips for matching, extracting, and transforming text and data. Today only, save 50% and discover the epic functionality of Reg Ex.”

    Read the article

  • database api commands

    - by Rahul Mehta
    I am developing a database API for a project, and I am designing commands for getting data from the database. For example, I have one gib table, so the command for it is: getgib name alias limit fields. If the user passes a name, e.g. getgib rahul, it returns all the gib data whose name is like rahul. If an alias is given, it returns all the gibs owned by the user whose alias (userid) is given. So I want to design the commands. limit: limits the number of records returned by the query; fields: extra fields I want to add to the select query. The commands are set for now, but: Question 1: I want to get gibs by gibid, so how should I add this? Any suggestion to improve my commands is welcome. Question 2: If the user doesn't want to specify the name, and only wants the gibs for a given alias, what separator should I use in place of the name?
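One way to reason about the command design is to look at the query each command form would generate. The sketch below is a hedged illustration only: the gib table's column names (name, userid, gibid) and the T-SQL-style TOP syntax are assumptions (use LIMIT or the equivalent for your database):
    -- "getgib rahul" with a limit of 20: filter by name
    SELECT TOP (20) g.* FROM gib AS g WHERE g.name LIKE '%rahul%';
    -- "getgib <alias>": filter by the owner's userid instead
    SELECT TOP (20) g.* FROM gib AS g WHERE g.userid = 'some_alias';
One possible answer to both questions is to switch from positional arguments to named ones (for example getgib id:123 or getgib alias:rahul limit:20); a lookup by gibid then becomes just another named filter, and no placeholder separator is needed when the name is omitted.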

    Read the article

  • CodePlex Daily Summary for Sunday, March 21, 2010

    CodePlex Daily Summary for Sunday, March 21, 2010New ProjectsAdaptCMS: AdaptCMS is an open source CMS that is made for complete control of your website, easiness of use and easily adaptable to any type of website. It's...Aura: Aura is a application that calculates average color of desktop background image or whole screen and sets it as Aero Glass color.Boxee Launcher: Boxee Launcher is a simple Windows Media Center add-in that attempts to launch Boxee and manage the windows as seamlessly as possible.ClothingSMS: ClothingSMSEasySL3ColorPicker: Silverlight 3 color picker user control.Fluent Moq Builder: Gives a fluent interface to help with building complex mock objects with Moq, such as mocks of HttpContextBase for help in unit testing ASP.NET MVC.Folder Bookmarks: A simple folder/file bookmark manager. Simple and easy to use. No need to navigate large folder directories, just double-click the bookmark to open...GeocodeThe.Net: GeoCodeThe.Net seeks to promote geographic tagging for all content on the web. It is our belief that anything on the web can and should be geocoded...GNF: GNF is a open source WPF based GPS library controlsHKGolden Express: HKGolden Express is a web application to generate simplified layout of HKGolden forum. HKGolden Express is written in Java EE, it can be deployed o...Informant: Informant provides people with the desire to send mass SMS to specific groups with the ability to do so using Google Voice. Included with Informant...JSON Object Serializer .Net 1.1: JSON serializer is used to created javascript object notation strings. It was written in the .NET 1.1 framework, and has capabilities of serializ...LightWeight Application Server: LWAS aims to provide the means for non-technical developers using just a web browser to create data-centered applications from user interface desig...MicroHedge: Quant FiNerd Clock: NerdClock is my windows phone 7 test app. A clock for nerds, time reads in binary.PhotoHelper: PhotoHelper makes it easier to organize the photoes, if your photoes are put into different locations, but you think they are the same category, yo...Pylor: An ASP.NET MVC custom attribute that allows the configuration of role based security access rules with similar functionality to the System.Web.Mvc....radiogaga: Access an online data source of internet streaming media and present it using a mixed paradigm of embedded web browser and rich client functionalit...Register WCF LOB Adapter MSBuild Task: If you would like to use MSBuild to register a WCF LOB Adapter in a given server, the custom tasks: RegisterWCFLOBAdapter and UnregisterWCFLOBAdapt...Restart Explorer: Utility to start, stop and restart Windows Explorer.Silverlight 4 Netflix Browser: Demonstrates using a WCF Data Client in Silverlight 4 to browse movie titles with the Netflix OData API announced at MIX 10.trayTwittr: With trayTwittr you can easily update your Twitterstatus right from the Systray. The GUI is designed like the Notificationpanels in Windows 7 (e.g....Warensoft Socket Server: Warensoft Socket Server is a solo server, which never cares about the real logical business. 
While you could process your socket message with IronP...Weka - Message Serialization: Message serialization framework for .net, including Compact Framework.New Releases[Tool] Vczh Visual Studio UnitTest Coverage Analyzer: Coverage Analyzer (beta): Done features: Load Visual Studio Code Coverage XML File (get this file by clicking "Export Results" in "Test->Windows->Code Coverage Results" in V...Aura: Aura Beta 1: Initial releaseBoxee Launcher: BoxeeLauncher Release 1.0.1.0: BoxeeLauncher Release 1.0.1.0 is the initial, barely-tested release of this Windows Media Center add-in. It should work in Vista Media Center and 7...Controlled Vocabulary: 1.0.0.2: System Requirements Outlook 2007 / 2010 .Net Framework 3.5 Installation 1. Close Outlook (Use Task Manager to ensure no running instances in the b...CycleMania Starter Kit EAP - ASP.NET 4 Problem - Design - Solution: Cyclemania 0.08.33: removed ASP.NET Menu from admin module applied security role filtering to Dashboard panels/tabsDDDSample.Net: 0.7: This is the next major release of DDDSample. This time I give you 4 different versions of the application: Classic (vanilla) with synchronous inter...DirectoryInfoEx: DirectoryInfoEx 0.16: 03-14-10 Version 0.13 o Fixed FileSystemWaterEx ignore remove directory event. o Fixed Removed IDisposable ...Employee Scheduler: Employee Scheduler [2.6]: Fixed clear data methods to handle holiday modification Added buttons to allow holiday and add time exceptions Updated drag/drop and resize of holi...Enovatics Foundation Library: Enovatics Foundation Library V1.4: This version provides the following components : Strongly Typed cache management, CSV management Base classes for SQL Server data access laye...Fluent Moq Builder: Version 0.1: Intial release. Contains (incomplete) builders for HttpRequestBase, HttpContextBase and ControllerContext. Mock methods so far focus around request...Folder Bookmarks: Folder Bookmarks 1.4: This is the latest version of Folder Bookmarks (1.4). It has an installer - it will create a directory 'CPascoe' in My Documents. Once you have ex...Folder Bookmarks: Source Code: This has the all the code for Folder Bookmarks in a Text file.Genesis Smart Client Framework: Genesis Smart Client Framework v1.60.1024.1: This release features the first installer for the Genesis Smart Client Framework. The installer creates the database and set's up the Internet Info...HKGolden Express: HKGoldenExpress (Build 201003201725): New features: (None) Bug fix: (None) Improvements: Added <meta> tag to optimize screen layout for different screen size. Added drop-down li...Home Access Plus+: v3.1.5.0: Version 3.1.5.0 Release Change Log: Fixed CSS for My Computer in List View Ability to remember which view mode you have selected Added HA+ home...IT Tycoon: IT Tycoon 0.2.0: Started refactoring toward more formatted and documented code and XAML.JSON Object Serializer .Net 1.1: jsonSerializer: Basic jsonSerializer binary. Now only handles an object model using reflection. There's no optimization added to the codebase handling .Net Refle...LightWeight Application Server: 0.4.0: 2010-03-20 lwas 0.4.0 This release is intended for c# developers audience only. 
Developed with MS vWD 2008 Express with .NET 3.5 and writen in c#....Microsoft Dynamics CRM 4.0 Marketing List Member Importer: Nocelab ExcelAddin - Release 2.0: Release note: - new installation procedure - fix some bugs related with the import procedure - errors during the import are displayed in red bold ...MSBuild Mercurial Tasks: 0.2.1 Stable: This release realises the Scenario 2 and provides two MSBuild tasks: HgCommit and HgPush. This task allows to create a new changeset in the current...NetSockets: NetBox (Example): Example application using the NetSockets library.NetSockets: NetSockets: The NetSockets library (DLL)Open Dotnet CMS: Open Dotnet CMS 1.6.2: This release update Open Dotnet CMS Console which now uses the modulare client application framework provided by Viking.Windows.Form library. The ...Open Portal Foundation: Open Portal Foundation V1.4: This release updates templates and theming library, and templates are now thematizable. This release also provides a better sample site and online ...PHPWord: PHPWord 0.6.0 Beta: Changelog: Added support for hyperlinks (inserting and formatting) Added support for GD images (created by PHP) Added support for Full-Table-St...Plurk.NET API and Client Applications: Plurk API Component: Plurk API Component is a wrapper of Plurk HTTP API.Register WCF LOB Adapter MSBuild Task: Register WCF LOB Adapter MSBuild Task 1.0: Register WCF LOB Adapter MSBuild Task Version 1.0 For more information visit: http://whiteboardworks.com/2010/02/installing-wcf-lob-adapters-wit...SCSI Interface for Multimedia and Block Devices: Release 11 - Complete User-Friendly Burning App!: I made numerous internal and external changes in this release of the program, the most important ones of which are: An intermediate buffer to make ...SharePoint LogViewer: SharePoint LogViewer 1.5.2: This release has following improvements: Scroll position is maintained when log is refreshed Filtering/Sorting performance has been significantly ...ShellLight: ShellLight 0.2.0.0: This is the first feature complete and full functional version of ShellLight. It is still a very simple framework with a limited set of features, b...Silverlight Media Player (3.0): Silverlight Media Player v.02: Silverlight Media Player (2.0/3.0/4.0) major upgrade: initial settings and media elements are stored in external XML filesStardust: Stardust Binaries: Stardust BinariesToolkit.forasp.net Multipurpose Tools for Asp.net written in C#: Beta 1: Beta 1 of csToolkit.dllToolkitVB.net is a set of Multipurpose Tools for Asp.net written in VB: Beta 1: Beta 1 of ToolKitVB.dllTransparent Persistence.Net: TP.Net 0.1.1: This is a minor update that produces separate 2.0 and 3.5 builds. Additionally type to persistence store naming has been enhanced.VCC: Latest build, v2.1.30320.0: Automatic drop of latest buildVisual Studio DSite: Screen Capture Program (Visual C++ 2008): This screen capture program can capture the whole screen of your computer and save it in any picture format you want including gif, bmp, jpg and pn...WPF Dialogs: Version 0.1.3 for .Net 3.5: This is a special version of the "Version 0.1.3" for the .Net-framework 3.5. You can use these library like the .Net 4.0 version. 
The changes are o...Most Popular ProjectsMetaSharpSavvy DateTimeRawrWBFS ManagerSilverlight ToolkitASP.NET Ajax LibraryMicrosoft SQL Server Product Samples: DatabaseAJAX Control ToolkitLiveUpload to FacebookWindows Presentation Foundation (WPF)Most Active ProjectsLINQ to TwitterRawrOData SDK for PHPjQuery Library for SharePoint Web ServicesDirectQPHPExcelFarseer Physics Enginepatterns & practices – Enterprise LibraryBlogEngine.NETNB_Store - Free DotNetNuke Ecommerce Catalog Module

    Read the article

  • BAM design pointers

    - by Kavitha Srinivasan
    In working recently with a large Oracle customer on SOA and BAM, I discovered that some BAM best practices are not quite as well known as I had always assumed! There is a doc bug out to formally incorporate those learnings but here are a few notes. EMS-DO parity When using EMS (Enterprise Message Source) as a BAM feed, the best practice is to use one EMS to write to one Data Object. There is a possibility of collisions and duplicates when multiple EMS write to the same row of a DO at the same time. This customer had 17 EMS writing to one DO at the same time. Every sensor in their BPEL process wrote to one topic, but the topic was read by 1 EMS corresponding to one sensor. They then used XSL within BAM to transform the payload into the BAM DO format. And hence for a given BPEL instance, 17 sensors fired, populated 1 JMS topic, and were consumed by 17 EMS which in turn wrote to 1 DataObject. (You can imagine what would happen for later versions of the application that need to send more information to BAM!) We modified their design to use one Master XSL based on sensorname for all sensors relating to a DO, say Data Object 'Orders', and were thus able to reduce the 17 EMS to 1 with a master XSL. For those of you wondering about how squeaky clean this design is, you are right! This is indeed not squeaky clean, and that brings us to yet another 'inferred' best practice. (I try very hard not to state the obvious in my blogs with the hope that every time I blog, it is very useful, but this one is an exception.) Transformations and Calculations It is optimal to do transformations within an engine like BPEL. Not only does this provide modelling ease with a nice GUI XSL mapper in JDeveloper, the XSL engine in BPEL is quite efficient at runtime as well. And so, doing XSL transformations in BAM is not quite prudent. The same is true for any non-trivial calculations as well. It is best to do all transformations and calculations, and sanitize the data, in a BPEL or similar layer and then send this to BAM (via JMS, WS, etc.). This then delegates just the report rendering and the mechanics of real-time reporting to the Oracle BAM reporting tool, which is what it is most suited to do. All nulls are not created equal Here is yet another possibly known fact, but reiterated here. For an EMS with an Upsert operation: a) If empty tags or tags with no value are sent, like <Tag1/> or <Tag1></Tag1>, the DO field will be overwritten with --null-- b) If empty tags are suppressed, i.e. not generated at all, the corresponding DO field will NOT be overwritten. The field will have whatever value existed previously. For an EMS with an Insert operation, both tags with an empty value and no tags result in --null-- being written to the DO. Hope this helps. Happy 4th!
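To make the upsert behaviour in points (a) and (b) concrete, here are two hedged example payloads; the element names are invented for illustration and are not from an actual BAM message:
    <!-- payload 1: Status is sent empty, so the DO field is overwritten with null -->
    <OrderUpdate><OrderId>1001</OrderId><Status/></OrderUpdate>
    <!-- payload 2: Status is suppressed entirely, so the existing DO value is preserved -->
    <OrderUpdate><OrderId>1001</OrderId></OrderUpdate>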

    Read the article

< Previous Page | 888 889 890 891 892 893 894 895 896 897 898 899  | Next Page >