Search Results

Search found 16840 results on 674 pages for 'build'.


  • ADF Faces now in Eclipse

    - by shay.shmeltzer
    The new version of Oracle Enterprise Pack for Eclipse was just released, and one of the key new features it offers is integration of Oracle ADF Faces development in Eclipse. If you are serious about developing with JSF, you probably know by now that ADF Faces is the richest set of components out there, both in terms of the number of components and the functionality they offer. The components offer a lot of Ajax functionality out of the box, and the framework also offers windowing, drag and drop, push, a JavaScript API, skinning and much more. OEPE makes it simple to build with ADF Faces and test-run your application. Here is a basic tutorial that will get you all set up to use this combination. Once you do that, you can then do this:

    Read the article

  • DIY Weather-Aware Umbrella Stand Signals Stormy Weather

    - by Jason Fitzpatrick
    This clever DIY project adds ambient weather notification to your umbrella stand: simply walk by it on your way out the door to get a subtle reminder to take your umbrella. The setup involves a hobby board, motion detection, and LEDs to a rather clever end. As you walk by the semi-translucent umbrella stand it is all mounted in, it lights up to indicate the weather conditions. Blue indicates the forecast for the day shows no sign of rain, green indicates rain, and red indicates thunderstorms. Check out the above video to see the hardware involved and the stand in action; hit up the link below for the full build guide including code. DIY Umbrella Stand Hack with Rain Alert [via Make]

    Read the article

  • SilverlightShow for Feb 14 - 20, 2011

    - by Dave Campbell
    Check out the Top Five most popular news at SilverlightShow for Feb 14 - 20, 2011. Way ahead of all other news for the week, in terms of popularity, is the news on the latest Silverlight 4 runtime update. Here are the top 5 news items on SilverlightShow for last week:

        1. Silverlight 4.0.60129.0 GRD3 Runtime update KB2495644
        2. FloatingWindow v1.2 — multi-windows interface for Silverlight 4
        3. Silverlight MVVM Commanding II
        4. Upcoming SilverlightShow Webinar: 'Switch or no switch: Can I build my business apps in LightSwitch?'
        5. Kinect and WPF: Painting with Kinect using OpenNI

    Visit and bookmark SilverlightShow. Stay in the 'Light

    Read the article

  • ASP.NET Web API - Screencast series Part 4: Paging and Querying

    - by Jon Galloway
    We're continuing a six-part series on ASP.NET Web API that accompanies the getting started screencast series. This is an introductory screencast series that walks through from File / New Project to some more advanced scenarios like Custom Validation and Authorization. The screencast videos are all short (3-5 minutes) and the sample code for the series is both available for download and browsable online. I did the screencasts, but the samples were written by the ASP.NET Web API team.

    In Part 1 we looked at what ASP.NET Web API is, why you'd care, did the File / New Project thing, and did some basic HTTP testing using browser F12 developer tools. In Part 2 we started to build up a sample that returns data from a repository in JSON format via GET methods. In Part 3, we modified data on the server using DELETE and POST methods. In Part 4, we'll extend our simple querying methods from Part 2, adding in support for paging and querying.

    This part shows two approaches to querying data (paging really just being a specific querying case): you can do it yourself using parameters passed in via querystring (as well as headers, other route parameters, cookies, etc.). You're welcome to do that if you'd like. What I think is more interesting here is that Web API actions that return IQueryable automatically support OData query syntax, making it really easy to support some common query use cases like paging and filtering. A few important things to note:

        - This is just support for OData query syntax - you're not getting back data in OData format. The screencast demonstrates this by showing that the GET methods continue to return the same JSON they did previously. So you don't have to "buy in" to the whole OData thing, you're just able to use the query syntax if you'd like.
        - This isn't full OData query support - full OData query syntax includes a lot of operations and features - but it is a pretty good subset: filter, orderby, skip, and top.
        - All you have to do to enable this OData query syntax is return an IQueryable rather than an IEnumerable. Often, that could be as simple as using the AsQueryable() extension method on your IEnumerable.
        - Query composition support lets you layer queries intelligently. If, for instance, you had an action that showed products by category using a query in your repository, you could also support paging on top of that. The result is an expression tree that's evaluated on-demand and includes both the Web API query and the underlying query.

    So with all those bullet points and big words, you'd think this would be hard to hook up. Nope, all I did was change the return type from IEnumerable<Comment> to IQueryable<Comment> and convert the Get() method's IEnumerable result using the .AsQueryable() extension method:

        public IQueryable<Comment> GetComments()
        {
            return repository.Get().AsQueryable();
        }

    You still need to build up the query to provide the $top and $skip on the client, but you'd need to do that regardless. Here's how that looks:

        $(function () {
            //---------------------------------------------------------
            // Using Queryable to page
            //---------------------------------------------------------
            $("#getCommentsQueryable").click(function () {
                viewModel.comments([]);

                var pageSize = $('#pageSize').val();
                var pageIndex = $('#pageIndex').val();
                var url = "/api/comments?$top=" + pageSize + '&$skip=' + (pageIndex * pageSize);

                $.getJSON(url, function (data) {
                    // Update the Knockout model (and thus the UI) with the comments received back
                    // from the Web API call.
                    viewModel.comments(data);
                });
                return false;
            });
        });

    And the neat thing is that - without any modification to our server-side code - we can modify the above jQuery call to request the comments be sorted by author, simply by appending $orderby to the URL:

        var url = "/api/comments?$top=" + pageSize + '&$skip=' + (pageIndex * pageSize) + '&$orderby=Author';

    So if you want to make use of OData query syntax, you can. If you don't like it, you're free to hook up your filtering and paging however you think is best. Neat. In Part 5, we'll add on support for Data Annotation based validation using an Action Filter.
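    To make the query-composition point above concrete, here is a minimal, hypothetical C# sketch (not taken from the article or its sample code) of an action that applies its own category filter and still returns IQueryable, so that OData options like $top, $skip and $orderby compose on top of it. The Product model, controller name and in-memory repository are assumptions for illustration only.

        // Hypothetical sketch of query composition: the action filters by category itself,
        // and any OData query options sent by the client ($top, $skip, $orderby) are layered
        // on top of that filter in a single composed query.
        using System.Collections.Generic;
        using System.Linq;
        using System.Web.Http;

        public class Product
        {
            public int Id { get; set; }
            public string Name { get; set; }
            public string Category { get; set; }
        }

        public class ProductsController : ApiController
        {
            // Stand-in for a real repository.
            private static readonly List<Product> repository = new List<Product>
            {
                new Product { Id = 1, Name = "Gizmo",    Category = "Gadgets" },
                new Product { Id = 2, Name = "Widget",   Category = "Gadgets" },
                new Product { Id = 3, Name = "Sprocket", Category = "Parts"   }
            };

            // GET /api/products?category=Gadgets&$top=1&$skip=0&$orderby=Name
            public IQueryable<Product> GetProducts(string category)
            {
                // Returning IQueryable (not IEnumerable) is what lets the framework
                // compose the client's query options with this category filter.
                return repository.Where(p => p.Category == category).AsQueryable();
            }
        }

    With that in place, a request such as GET /api/products?category=Gadgets&$top=1&$orderby=Name would page and sort within the "Gadgets" category, with both the filter and the client's query options evaluated as one deferred query.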

    Read the article

  • visual basic coach needed [closed]

    - by Danny
    I am trying to learn Visual Basic. I used to program in GW-BASIC and have trouble learning VB.NET by reading and googling all the time. It takes so much time to find the answers to my programming problems, and even then I do not understand why it has to be done that way. I have beginner's questions, like finding child windows using EnumWindows. Even after googling for hours and hours I do not seem to grasp it (must be my old age). I would like to find someone I could learn from by asking questions about what I want to program - not just to finish the program, but to learn and understand it too. Someone who doesn't find my questions stupid as I try to build my understanding of the Visual Basic environment. I hope to communicate using Skype voice or chat or other direct means when time permits. Cheers, Danny

    Read the article

  • Is the new Windows 8 SDK usable with Visual C++ Express 2010 on Windows 7?

    - by JohnB
    This is inspired by and related to "Is the June 2010 DX SDK really the latest?" asked recently, but it's a different question. I won't likely be purchasing the full Visual Studio 2012 for C++; I intend to use the free Visual C++ Express 2012 that targets desktop applications when it is released, so for now I'm using Visual C++ Express 2010 running on Windows 7. The latest DirectX 11 SDK is the one included in the Windows 8 SDK now; it's not a separate release any more. So my question is: can I use the Windows 8 SDK to build DirectX 11 programs that work on Windows 7, using Visual C++ Express 2010 running on Windows 7? Or do I need to stick to the final DirectX SDK release for now?

    Read the article

  • Mobile Chrome Office Hours: Tools for Mobile Web Development

    Mobile Chrome Office Hours: Tools for Mobile Web Development Ask and vote for questions at: goo.gl Are you building for the mobile web? Are you looking for easier and better tools to help you create great experiences? Join Boris Smus and Pete LePage as they show you some of the many tools available to mobile web developers. We'll take a look at Chrome's remote debugging features and some of the emulation tools available to you within Chrome, and take a deep dive into some of the advanced use cases of these tools to help you build for the mobile web. From: GoogleDevelopers Views: 1432 60 ratings Time: 42:16 More in Science & Technology

    Read the article

  • Is it smart to give an idea to a company?

    - by Ryan
    I have a few interesting ideas (business ideas) that can be implemented as add-on features of existing business products (web-based products, mostly startups). Based on experience, can anyone let me know whether telling the companies would be a good idea or not? I'm hoping to get some feedback from both sides (company insiders and outsiders). The upside: they could like it and think about bringing me on board to help build it. The downside: they take the idea and implement it themselves. One company has fewer than 50 employees; another has fewer than 25.

    Read the article

  • In Scrum, should tasks such as development environment set-up and capability development be managed as subtasks within actual user stories?

    - by Asim Ghaffar
    Sometimes in projects we need to spend time on tasks such as:

        - exploring alternate frameworks and tools
        - learning the framework and tools selected for the project
        - setting up the servers and project infrastructure (version control, build environments, databases, etc.)

    If we are using user stories, where should all this work go? One option is to make it all part of the first user story (e.g. make the homepage for the application). Another option is to do a spike for these tasks. A third option is to make the tasks part of an Issue/Impediment (e.g. development environment not selected yet) rather than a user story.

    Read the article

  • Microsoft Technical Computing

    In the past I have described the team I belong to here at Microsoft (Parallel Computing Platform) in terms of contributing to Visual Studio and related products, e.g. .NET Framework. To be more precise, our team is part of the Technical Computing group, which is still part of the Developer Division. This was officially announced externally earlier this month in an exec email (from Bob Muglia, the president of STB, to which DevDiv belongs). Here is an extract: "As we build the Technical...

    Read the article

  • Applicability of the Joel Test to web development companies

    - by dreftymac
    QUESTION: How can you rewrite the questions of the Joel Test to apply to web developers? For example:

        1. Do you use source control? (source control for all aspects of your app, including configuration, database and user-based settings?)
        2. Can you make a build in one step? (can you deploy a site from staging to prod in one step?)
        ...
        10. Do you have testers? (how do you test AJAX and CSS?)

    BACKGROUND: This is for people who work in a shop that does some web development but also uses off-the-shelf tools like Drupal and WordPress, doing custom development on top of that.

    RELATED LINKS:

        - http://www.joelonsoftware.com/articles/fog0000000043.html
        - What do you think about the Joel Test?

    Read the article

  • SNEAK PEEK: New Silverlight application themes

    'Twas the week before MIX, when all through the tubes
    Not a developer was sleeping, not even the noobs.
    The laptops were paved removed of their glitz
    In hopes that they soon will get some new bits.
    A developer was coding, building an app
    Trying to build the next greatest XAP
    Battleship gray?! Now that's obscene
    Check our designers' latest theme

    Okay, so I'm not going to win any poetry awards. Our UX design team for Silverlight has been thinking about app building a lot this past year,...

    Read the article

  • How does Google maintain its code?

    - by John Maxim
    The PageRank algorithm is not revealed to any of their associate programmers, but is only accessible to Larry Page or maybe Sergey Brin. I wonder how they go about managing their code? There are times when you need to build something up and you may need more hands to help with coding, but you may also want to keep some secrets to yourself. I'm not saying I have secrets, but I wonder how they manage their code. I'm sure there are some ways to do it decently and professionally. One of the factors in Friendster's failure was that they lost control over their code. I think this is an interesting question, but not an easy one to answer; maybe only a marginal few know.

    Read the article

  • SQL SERVER – SSIS Parameters in Parent-Child ETL Architectures – Notes from the Field #040

    - by Pinal Dave
    [Notes from Pinal]: SSIS is a very well explored subject; however, there are so many interesting elements that when we read about it, we learn something new. One such concept is the Parent-Child ETL architecture relationship in SSIS. Linchpin People are database coaches and wellness experts for a data driven world. In this 40th episode of the Notes from the Field series, database expert Tim Mitchell (partner at Linchpin People) shares a very interesting conversation about how to understand SSIS Parameters in Parent-Child ETL Architectures.

    In this brief Notes from the Field post, I will review the use of SSIS parameters in parent-child ETL architectures. A very common design pattern used in SQL Server Integration Services is one I call the parent-child pattern.  Simply put, this is a pattern in which packages are executed by other packages.  An ETL infrastructure built using small, single-purpose packages is very often easier to develop, debug, and troubleshoot than large, monolithic packages.  For a more in-depth look at parent-child architectures, check out my earlier blog post on this topic.

    When using the parent-child design pattern, you will frequently need to pass values from the calling (parent) package to the called (child) package.  In older versions of SSIS, this process was possible but not necessarily simple.  When using SSIS 2005 or 2008, or even when using SSIS 2012 or 2014 in package deployment mode, you would have to create package configurations to pass values from parent to child packages.  Package configurations, while effective, were not the easiest tool to work with.  Fortunately, starting with SSIS in SQL Server 2012, you can now use package parameters for this purpose.

    In the example I will use for this demonstration, I'll create two packages: one intended for use as a child package, and the other configured to execute said child package.  In the parent package I'm going to build a for each loop container in SSIS, and use package parameters to pass in a value – specifically, a ClientID – for each iteration of the loop.  The child package will be executed from within the for each loop, and will create one output file for each client, with the source query and filename dependent on the ClientID received from the parent package.

    Configuring the Child and Parent Packages

    When you create a new package, you'll see the Parameters tab at the package level.  Clicking over to that tab allows you to add, edit, or delete package parameters. As shown above, the sample package has two parameters.  Note that I've set the name, data type, and default value for each of these.  Also note the column entitled Required: this allows me to specify whether the parameter value is optional (the default behavior) or required for package execution.  In this example, I have one parameter that is required, and the other is not.

    Let's shift over to the parent package briefly, and demonstrate how to supply values to these parameters in the child package.  Using the execute package task, you can easily map variable values in the parent package to parameters in the child package. The execute package task in the parent package, shown above, has the variable vThisClient from the parent package mapped to the pClientID parameter shown earlier in the child package.  Note that there is no value mapped to the child package parameter named pOutputFolder.
    Since this parameter has the Required property set to False, we don't have to specify a value for it, which will cause that parameter to use the default value we supplied when designing the child package.

    The last step in the parent package is to create the for each loop container I mentioned earlier, and place the execute package task inside it.  I'm using an object variable to store the distinct client ID values, and I use that as the iterator for the loop (I describe how to do this in more depth here).  For each iteration of the loop, a different client ID value will be passed into the child package parameter.

    The final step is to configure the child package to actually do something meaningful with the parameter values passed into it.  In this case, I've modified the OleDB source query to use the pClientID value in the WHERE clause of the query to restrict results for each iteration to a single client's data.  Additionally, I'll use both the pClientID and pOutputFolder parameters to dynamically build the output filename. As shown, the pClientID is used in the WHERE clause, so we only get the current client's invoices for each iteration of the loop. For the flat file connection, I'm setting the Connection String property using an expression that engages both of the parameters for this package, as shown above.

    Parting Thoughts

    There are many uses for package parameters beyond a simple parent-child design pattern.  For example, you can create standalone packages (those not intended to be used as a child package) and still use parameters.  Parameter values may be supplied to a package directly at runtime by a SQL Server Agent job, through the command line (via dtexec.exe), or through T-SQL. You can also have project parameters as well as package parameters.  Project parameters work in much the same way as package parameters, but the parameters apply to all packages in a project, not just a single package.

    Conclusion

    Of the numerous advantages of using the catalog deployment model in SSIS 2012 and beyond, package parameters are near the top of the list.  Parameters allow you to easily share values from parent to child packages, enabling more dynamic behavior and better code encapsulation. If you want me to take a look at your server and its settings, or if your server is facing any issue we can Fix Your SQL Server.

    Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Notes from the Field, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL

    Read the article

  • OTN Database Developer Day in LA/OC

    - by shay.shmeltzer
    We are taking a little break from the Fusion OTN Developer Days, and instead we'll be taking part in several OTN Developer Days run by the database team. The aim is to show what Oracle has to offer to various developer groups. As you might guess, we are going to be in the Java track, specifically running a lab that will let you experience Oracle JDeveloper (or OEPE) and show you how to build an application based on EJB/JSF with an Ajax UI. I'm going to be at the upcoming event on May 5th - if you are in the LA area and haven't experienced JDeveloper yet, come in and see what it is all about. Details here.

    Read the article

  • WCF/ADO.NET Data Services - Could not load type 'System.Data.Services.Providers.IDataServiceUpdatePr

    - by Sahil Malik
    When you try accessing ListData.svc, do you get the following error? Could not load type 'System.Data.Services.Providers.IDataServiceUpdateProvider' from assembly 'System.Data.Services, Version=3.5.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'. Well, if you followed the instructions in Chapter 1 of my book to build your VM, you wouldn't run into the above issue. But if you do, you need to install one of the following updates:

        - For Windows Vista and Windows 2008 - http://www.microsoft.com/downloads/details.aspx?familyid=4B710B89-8576-46CF-A4BF-331A9306D555&displaylang=en
        - For Windows 7 and Windows 2008 R2 - http://www.microsoft.com/downloads/details.aspx?familyid=79d7f6f8-d6e9-4b8c-8640-17f89452148e&displaylang=en

    Remember to: a) install the x64 version, and b) do an IISReset before trying again.

    Read the article

  • Spotlight on Claims: Serving Customers Under Extreme Conditions

    - by [email protected]
    Oracle Insurance's director of marketing for EMEA, John Sinclair, recently attended the CII Spotlight on Claims event in London. Bad weather and its implications for the insurance industry have become very topical as the frequency and diversity of natural disasters - including rains, wind and snow - has surged across Europe this winter. On England's wettest day on record, the county of Cumbria was flooded with 12 inches of rain within 24 hours. Freezing temperatures wreaked havoc on European travel, causing high-speed TGV trains to break down and stranding hundreds of passengers in a tunnel under the English Channel all night long without heat or electricity. A storm named Xynthia thrashed France and surrounding countries with hurricane force, flooding ports and killing 51 people. After the Spring Equinox, insurers may have thought the worst had passed. Then came along Eyjafjallajökull, spewing out vast quantities of volcanic ash in what is turning out to be one of the most costly natural disasters in history. Such extreme events challenge insurance companies' ability to service their customers just when customers need their help most. When you add economic downturn and competitive pressures to the mix, insurers are further stretched and required to continually learn and innovate to meet high customer expectations with reduced budgets. These and other issues were hot topics of discussion at the recent "Spotlight on Claims" seminar in London, focused on how weather is affecting claims and the insurance industry. The event was organized by the CII (Chartered Insurance Institute), a group with 90,000 members. CII has been at the forefront in setting professional standards for the insurance industry for over a century. Insurers came to the conference to hear how they could better serve their customers under extreme weather conditions, learn from the experience of their peers, and hear about technological breakthroughs in climate modeling, geographic intelligence and IT. Customer case studies at the conference highlighted the importance of effective and constant communication in handling the overflow of catastrophe related claims. First and foremost is the need to rapidly establish initial communication with claimants to build their confidence in a positive outcome. Ongoing communication then needs to be continued throughout the claims cycle to manage expectations and maintain ownership of the process from start to finish. Strong internal communication to support frontline staff was also deemed critical to successful crisis management, as was communication with the broader insurance ecosystem to tap into extended resources and business intelligence. Advances in technology - such as web-based systems to access policies and enter first notice of loss in the field - as well as customer-focused self-service portals and multichannel alerts, are instrumental in improving customer satisfaction and helping insurers to deal with the claims surge, which often can reach four or more times normal workloads. Dynamic models of the global climate system can now be used to better understand weather-related risks, and as these models mature it is hoped that they will soon become more accurate in predicting the timing of catastrophic events. Geographic intelligence is also being used within a claims environment to better assess loss reserves and detect fraud.
Despite these advances in dealing with catastrophes and predicting their occurrence, there will never be a substitute for qualified front line staff to deal with customers. In light of pressures to streamline efficiency, there was debate as to whether outsourcing was the solution, or whether it was better to build on the people you have. In the final analysis, nearly everybody agreed that in the future insurance companies would have to work better and smarter to keep on top. An appeal was also made for greater collaboration amongst industry participants in dealing with the extreme conditions and systematic stress brought on by natural disasters. It was pointed out that the public oftentimes judged the industry as a whole rather than the individual carriers when it comes to freakish events, and that all would benefit at such times from the pooling of limited resources and professional skills rather than competing in silos for competitive advantage - especially the end customer. One case study that stood out was on how The Motorists Insurance Group was able to power through one of the most devastating catastrophes in recent years - Hurricane Ike. The keys to Motorists' success were superior people, processes and technology. They did a lot of upfront planning and invested in their people, creating a healthy team environment that delivered "max service" even when they were experiencing the same level of devastation as the rest of the population. Processes were rapidly adapted to meet the challenge of the catastrophe and continually adapted to Ike's specific conditions as they evolved. Technology was fundamental to the execution of their strategy, enabling them anywhere access, on the fly reassigning of resources and rapid training to augment the work force. You can learn more about the Motorists experience by watching this video. John Sinclair is marketing director for Oracle Insurance in EMEA. He has more than 20 years of experience in insurance and financial services.

    Read the article

  • Xamarin Designer for Android Webinar - Recording

    - by Wallym
    Here is some info on the recording of the webinar that I did last week for AppDev regarding the Xamarin Designer for Android. Basic Info: Android user interfaces can be created declaratively by using XML files, or programmatically in code. The Xamarin Android Designer allows developers to create and modify declarative layouts visually, without having to deal with the tedium of hand-editing XML files. The designer also provides real-time feedback, which lets the developer validate changes without having to redeploy the application in order to test a design. This can speed up UI development in Android tremendously. In this webinar, we'll take a look at UI Design in Mono for Android, the basics of the Xamarin Android Designer, and build a simple application with the designer. Here is the link: http://media.appdev.com/EDGE/LL/livelearn05232012.wmv I think it will only play in Internet Explorer. Enjoy!
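    As a companion to the declarative-vs-programmatic point above, here is a small, hypothetical Mono for Android sketch (not taken from the webinar) of building a simple UI entirely in code rather than in a designer-generated XML layout; the activity and control names are assumptions for illustration.

        // Hypothetical example: the same kind of layout the designer would emit as XML,
        // constructed programmatically inside an activity.
        using Android.App;
        using Android.OS;
        using Android.Widget;

        [Activity(Label = "Code-Built UI", MainLauncher = true)]
        public class MainActivity : Activity
        {
            protected override void OnCreate(Bundle bundle)
            {
                base.OnCreate(bundle);

                // Build the view hierarchy in code instead of inflating a layout file.
                var layout = new LinearLayout(this) { Orientation = Orientation.Vertical };

                var label = new TextView(this) { Text = "Hello from a code-built UI" };
                var button = new Button(this) { Text = "Tap me" };
                button.Click += (sender, e) => label.Text = "Button tapped";

                layout.AddView(label);
                layout.AddView(button);

                SetContentView(layout);
            }
        }

    The designer-driven XML route covered in the webinar is usually preferable for anything non-trivial, but the code route can be handy for views that are assembled dynamically at runtime.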

    Read the article

  • Help! Online upgrade from 12.04 to 14.04 stuck

    - by Luis
    I was trying to upgrade my Lenovo T500 laptop from Kubuntu 12.04 LTS to Kubuntu 14.04 LTS. Fired up the upgrade process, and finally after downloading a zillion packages the upgrade got going, only to get stuck... It has been stuck for hours on: Installing the upgrades->Unpacking subversion Last lines of error messages: GLib-GObject-CRITICAL **: /build/buildd/glib2.0-2.32.4/./gobject/gtype.c:2722: You forgot to call g_type_init() at /usr/lib/perl/5.18/DynaLoader.pm line 207. GLib-CRITICAL **: g_once_init_leave: assertion `result != 0' failed at /usr/lib/perl/5.18/DynaLoader.pm line 207. I don't care much about subversion anymore, I use git now; I don't care if subversion ends up in a bad state and I have to remove it... I just want the upgrade to continue, and hopefully complete. Any ideas???? Thanks, Luis

    Read the article

  • Is traditional JavaScript image pre-loading taboo?

    - by Evan Plaice
    I remember the good old days (not really), back when I was still sucking the teat of Dreamweaver to build websites and the lure of playing copypasta with fancy built-in scripts (e.g. image swap) was like black magic. I'm pretty far removed from that nowadays, but I was adapting a small site from its original FrontPage (::cringe::) format to a standard HTML/CSS implementation and couldn't help wondering... should I re-implement the JavaScript image pre-loading in the current version? Or is there a better way? I don't want to block the page from loading by requiring the user to request all the assets within the page via the traditional JavaScript pre-loader method. I value giving the user something to look at ASAP, and there's some potential harm to my Google mojo by doing so. Is there a cleaner solution to prevent unnecessary page reflows during loading, such as setting static width/height dimensions through a CSS style attribute on the image element?

    Read the article

  • Best way of Javascript web development in Netbeans (Hot deployment)

    - by marcelocbf
    I'm beginning JavaScript development and, as a beginner, I make a lot of mistakes. The way I'm developing is very counter-productive, because for every mistake I fix I have to shut down GlassFish, re-build the app and re-deploy it. My app is a Java back-end with REST services and HTML, JavaScript and CSS for the frontend. Everything is packed in a .ear file. Right now I'm just working on the frontend, but I still have to go through this whole process to update the files. My question is... is there a better way of doing this? Can somebody tell me how you work in a similar setup for everyday development?

    Read the article

  • A Simple Online Document Management System Using Asp.net MVC 3

    - by RazanPaul
    Nowadays we have a number of online file management systems (e.g. DropBox, SkyDrive and Google Drive). People can use them to manage different types of documents. However, one might need a system to manage documents without publishing the company's documents to the cloud. In that case, they need to build an online document management system. This project is intended to meet this purpose. However, it is at an early stage. All the functionality seems to work, but a lot of work is needed on the UI, and the code needs refactoring. Please find the project at the following link: https://documentmanagementsystem.codeplex.com/

    Read the article

  • How to perform regular expression based replacements on files with MSBuild

    - by Daniel Cazzulino
    And without a custom DLL with a task, too. The example at the bottom of the MSDN page on MSBuild Inline Tasks already provides pretty much all you need for that, with a TokenReplace task that receives a file path, a token and a replacement and uses string.Replace to do the work. Similar in spirit but way more useful in its implementation is the RegexTransform in NuGet's Build.tasks. It's much better not only because it supports full regular expressions, but also because it receives items, which makes it very amenable to batching (applying the transforms to multiple items). You can read about how to use it for updating assemblies with a version number, for example. I recently had a need to also supply RegexOptions to the task, so I extended the metadata and a little bit of the inline task so that it can parse the optional flags. So when using the task, I can pass the flags as item metadata as follows: ... Read full article
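    As a rough illustration of the flag-parsing idea, here is a small, self-contained C# sketch (not the actual RegexTransform code from NuGet's Build.tasks) showing how comma-separated RegexOptions names could be parsed and passed to Regex.Replace; the version-bumping pattern and the "IgnoreCase, Multiline" flags are assumptions for illustration.

        // Hypothetical sketch: parse optional RegexOptions flags from a metadata-style string
        // and apply a regex replacement, which is essentially what such an inline task does
        // for the file behind each batched item.
        using System;
        using System.Text.RegularExpressions;

        static class RegexTransformSketch
        {
            // Enum.TryParse understands comma-separated [Flags] names such as
            // "IgnoreCase, Multiline"; fall back to None when nothing usable is supplied.
            static RegexOptions ParseOptions(string optionsText)
            {
                RegexOptions options;
                if (string.IsNullOrWhiteSpace(optionsText) ||
                    !Enum.TryParse(optionsText, true, out options))
                {
                    options = RegexOptions.None;
                }
                return options;
            }

            static void Main()
            {
                // In the real task the text would be read from (and written back to) the file
                // behind each item; a literal string keeps this sketch self-contained.
                string content = "[assembly: AssemblyVersion(\"1.0.0.0\")]";

                string transformed = Regex.Replace(
                    content,
                    @"AssemblyVersion\(""\d+\.\d+\.\d+\.\d+""\)",
                    @"AssemblyVersion(""1.2.3.0"")",
                    ParseOptions("IgnoreCase, Multiline"));

                Console.WriteLine(transformed); // [assembly: AssemblyVersion("1.2.3.0")]
            }
        }

    Inside an MSBuild inline task, the same logic would simply run once per item when the task is batched over an item group, reading and rewriting each file in turn.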

    Read the article

  • Trash Destination Adapter

    The Trash Destination and this article came from early experiences of using SSIS and community feedback at the time. When developing a package it is very useful to have a destination adapter that does nothing but consume rows with no setup requirement. You often want to run a package part way through development, or just add a path so you can set a Data Viewer. There are stock tasks that can be used, but with the Trash Destination all columns are treated as selected automatically (usage type of read-only), so the pipeline knows they are required. It is also obvious that this is for development or diagnostic purposes, and is clearly not a part of the functional design of the package. It is also ideal for just playing around and exploring concepts in SSIS, and is often used in conjunction with the Data Generator Source. Using these two components it is easy to set up a test of an expression in the Derived Column Transformation, for example. The Data Generator Source provides some dummy data, and the Trash Destination allows you to anchor the output path and set a Data Viewer to examine the results. It can also be used when performance tuning packages. It is a consistent and known quantity that has no external influences, so it is ideal as a destination when breaking the data flow into sections to isolate a bottleneck. The adapter is really simple to use and requires no setup. Simply drop it onto the pipeline designer and use it to terminate your data flow path.

    Installation

    The component is provided as an MSI file which you can download and run to install it. This simply places the files on disk in the correct locations and also installs the assemblies in the Global Assembly Cache as per Microsoft's recommendations. You may need to restart the SQL Server Integration Services service, as this caches information about what components are installed, as well as restarting any open instances of Business Intelligence Development Studio (BIDS) / Visual Studio that you may be using to build your SSIS packages. Finally, for 2005/2008, you will have to add the transformation to the Visual Studio toolbox manually. Right-click the toolbox, and select Choose Items.... Select the SSIS Data Flow Items tab, and then check the Trash Destination transformation in the Choose Toolbox Items window. This process has been described in detail in the related FAQ entry for How do I install a task or transform component? We recommend you follow best practice and apply the current Microsoft SQL Server Service Pack to your SQL Server servers and workstations.

    Downloads

    The Trash Destination is available for SQL Server 2005, SQL Server 2008 (includes R2) and SQL Server 2012. Please choose the version to match your SQL Server version, or you can install multiple versions and use them side by side if you have more than one version of SQL Server installed.

        - Trash Destination for SQL Server 2005
        - Trash Destination for SQL Server 2008
        - Trash Destination for SQL Server 2012

    Version History

    SQL Server 2012
        - Version 3.0.0.34 - SQL Server 2012 release. Includes upgrade support for both 2005 and 2008 packages to 2012. (5 Jun 2012)
    SQL Server 2008
        - Version 2.0.0.33 - SQL Server 2008 release. Includes support for upgrade of 2005 packages. RTM compatible, previously February 2008 CTP. (4 Mar 2008)
        - Version 2.0.0.31 - SQL Server 2008 November 2007 CTP. (14 Feb 2008)
    SQL Server 2005
        - Version 1.0.2.18 - SQL Server 2005 RTM Refresh. SP1 Compatibility Testing. (12 Jun 2006)
        - Version 1.0.1.1 - SQL Server 2005 IDW 15 June CTP. Minor enhancements over v1.0.1.0. (11 Jun 2005)
        - Version 1.0.1.0 - SQL Server 2005 IDW 14 April CTP. First Public Release. (30 May 2005)

    Troubleshooting

    Make sure you have downloaded the version that matches your version of SQL Server. We offer separate downloads for SQL Server 2005, SQL Server 2008 and SQL Server 2012. If you get an error when you try to use the component along the lines of The component could not be added to the Data Flow task. Please verify that this component is properly installed. ... The data flow object "Konesans ..." is not installed correctly on this computer, this usually indicates that the internal cache of SSIS components needs to be updated. This is held by the SSIS service, so you need to restart the SQL Server Integration Services service. You can do this from the Services applet in Control Panel or Administrative Tools in Windows. You can also restart the computer if you prefer. You may also need to restart any current instances of Business Intelligence Development Studio (BIDS) / Visual Studio that you may be using to build your SSIS packages. The full error message is shown below for reference:

        TITLE: Microsoft Visual Studio
        ------------------------------
        The component could not be added to the Data Flow task.
        Please verify that this component is properly installed.
        ------------------------------
        ADDITIONAL INFORMATION:
        The data flow object "Konesans.Dts.Pipeline.TrashDestination.Trash, Konesans.Dts.Pipeline.TrashDestination, Version=1.0.1.0, Culture=neutral, PublicKeyToken=b8351fe7752642cc" is not installed correctly on this computer. (Microsoft.DataTransformationServices.Design)

    For 2005/2008, once installation is complete you need to manually add the task to the toolbox before you will see it and are able to add it to packages - see How do I install a task or transform component? This is not necessary for SQL Server 2012, as the new SSIS toolbox automatically detects components. If you are still having issues then contact us, but please provide as much detail as possible about the error, as well as which version of the task you are using and details of the SSIS tools installed.

    Read the article

  • New games for 2011

    - by Oli
    I know there have been other questions like "What native games are available?" and they often have issues because they turn into a never-ending list of every game ever released for Linux. But I'd like to know what's coming out this year. Good answers can include:

        - A game that's coming out in 2011
        - A Linux port being released for a game that might be older (e.g. Trine)
        - As much information and as many screenshots and links as possible
        - Few old games, unless they're doing a major update that changes the game very significantly

    One game per answer; add as much information as possible and work with each other to build a catalogue of awesome things to look forward to this year.

    Read the article
