Search Results

Search found 27337 results on 1094 pages for 't sql'.


  • How to store Role Based Access rights in web application?

    - by JonH
    Currently working on a web based CRM type system that deals with various Modules such as Companies, Contacts, Projects, Sub Projects, etc. A typical CRM type system (asp.net web form, C#, SQL Server backend). We plan to implement role based security so that basically a user can have one or more roles. Roles would be broken down first by the module type, such as:
    - Company
    - Contact
    And then by the actions for that module. For instance, each module would end up with a table such as this:
    Role1 Example:
        Module    | Create | Edit       | Delete | View
        Company   | Yes    | Owner Only | No     | Yes
        Contact   | Yes    | Yes        | Yes    | Yes
    In the above case Role1 has two module types (Company, and Contact). For company, the person assigned to this role can create companies, can view companies, can only edit records he/she created and cannot delete. For this same role for the module contact this user can create contacts, edit contacts, delete contacts, and view contacts (full rights basically).
    I am wondering is it best upon coming into the system to session the user's role with something like a: List<Role> roles; where the Role class would have some sort of List<Module> modules; (can contain Company, Contact, etc.)? Something to the effect of:
        class Role {
            string name;
            string desc;
            List<Module> modules;
        }
    And the module action class would have a set of actions (Create, Edit, Delete, etc.) for each module:
        class ModuleActions {
            List<Action> actions;
        }
    And the action has a value of whether the user can perform the right:
        class Action {
            string right;
        }
    Just a rough idea, I know the action could be an enum and the ModuleAction can probably be eliminated with a List<x, y>. My main question is what would be the best way to store this information in this type of application: Should I store it in the User Session state (I have a session class where I manage things related to the user)? I generally load this during the initial loading of the application (global.asax). I can simply tack onto this session. Or should this be loaded at the page load event of each module (page load of company etc.)? I eventually need to be able to hide / unhide various buttons / divs based on the user's role and that is what got me thinking to load this via session. Any examples or points would be great.
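    As a rough illustration of the session approach discussed above, here is a minimal C# sketch; the names (RightType, ModulePermission, SecurityCache) are hypothetical adaptations of the Role/Module/Action classes in the question, not an existing API, and a real implementation would fill the roles from the database at login:

        using System.Collections.Generic;
        using System.Linq;
        using System.Web;

        public enum RightType { Create, Edit, Delete, View }
        public enum ModuleType { Company, Contact, Project, SubProject }

        public class ModulePermission
        {
            public ModuleType Module { get; set; }
            // e.g. "Yes", "No", "Owner Only" - kept as a string to match the matrix above
            public Dictionary<RightType, string> Rights { get; set; }
        }

        public class Role
        {
            public string Name { get; set; }
            public string Desc { get; set; }
            public List<ModulePermission> Modules { get; set; }
        }

        public static class SecurityCache
        {
            // Called once, e.g. from the login handler or from global.asax
            public static void StoreRoles(List<Role> roles)
            {
                HttpContext.Current.Session["Roles"] = roles;
            }

            // Used from a page's load/prerender to decide whether to render a button or div
            public static bool HasRight(ModuleType module, RightType right)
            {
                var roles = HttpContext.Current.Session["Roles"] as List<Role>;
                if (roles == null) return false;

                return roles.Any(r => r.Modules.Any(m =>
                    m.Module == module &&
                    m.Rights.ContainsKey(right) &&
                    m.Rights[right] != "No"));
            }
        }

    With something like this cached once at login, each page only needs a call such as btnDelete.Visible = SecurityCache.HasRight(ModuleType.Company, RightType.Delete); the "Owner Only" case would still need the record's owner checked at that point.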

    Read the article

  • Find a Faster DNS Server with Namebench

    - by Mysticgeek
    One way to speed up your Internet browsing experience is using a faster DNS server. Today we take a look at Namebench, which will compare your current DNS server against others out there, and help you find a faster one.
    Namebench
    Download the file and run the executable (link below). Namebench starts up and will include the current DNS server you have configured on your system. In this example we’re behind a router and using the DNS server from the ISP. Include the global DNS providers and the best available regional DNS server, then start the Benchmark. The test starts to run and you’ll see the queries it’s running through. The benchmark takes about 5-10 minutes to complete. After it’s complete you’ll get a report of the results. Based on its findings, it will show you what DNS server is fastest for your system. It also displays different types of graphs so you can get a better feel for the different results. You can export the results to a .csv file as well so you can present the results in Excel.
    Conclusion
    This is a free project that is in continuing development, so results might not be perfect, and there may be more features added in the future. If you’re looking for a method to help find a faster DNS server for your system, Namebench is a cool free utility to help you out. If you’re looking for a public DNS server that is customizable and includes filters, you might want to check out our article on helping to protect your kids from questionable content using OpenDNS. You can also check out how to speed up your web browsing with Google Public DNS.
    Links
    Download NameBench for Windows, Mac, and Linux from Google Code
    Learn More About the Project on the Namebench Wiki Page

    Read the article

  • I'm creating my own scalable, rapid prototyping web server. How should I design it?

    - by Mike Willliams
    I'm going to create my own web server that focuses on scalability, rapid prototyping and the use of JavaScript as the server's scripting language, much like node.js. It will use a Model-View-Controller design pattern so a web application can support more concurrent users just by adding hardware -- and not having to redesign the software. Basically, I'm aiming to produce a framework that allows for fast and easy development of cloud applications without the need to write lots of boilerplate code. I've got some questions about this... How hard will it be to put MySQL in the cloud? How could I go about implementing this and make the resulting product free? Will I have to write my own engine or modify an existing one? If I do, what should I watch out for? To make this scalable I need to adjust from one server to hundreds of servers; this creates the requirement for the servers to be load balanced. How should I do this? If I balance based on the work load per server I would need a gateway to handle all the incoming requests. Is it the right idea to have all the servers check into the gateway and update their status? By having the servers run through a gateway, if the gateway dies all the incoming requests are ignored. I'm thinking that by having all the servers maintain a list of each other, or at least a few of them, I could rebuild the list of servers and establish a new gateway. Is it worth it? Or should I have a backup gateway that could switch out? Should I let the user choose? How should I pick which server handles the database and which handles the page serving? Should I spread the database so that queries are performed on multiple servers, which would theoretically improve performance? The servers would need to mirror the database at least once so that if a server goes down the database isn't corrupted. So this brings up another question: should I broadcast SQL queries so that all the servers can take a bit of the work load? If I do it that way, wouldn't a query clog up the network so that other queries couldn't be performed? What are my alternatives? Finally, is there a free solution already out there that might need a little modification that suits my needs?
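    To make the gateway question a bit more concrete, here is a minimal round-robin picker in C#; it is only a sketch of the idea (the post itself targets JavaScript), and the ServerNode type with its IsHealthy heartbeat flag is a made-up illustration, not an existing framework:

        using System.Collections.Generic;
        using System.Linq;
        using System.Threading;

        public class ServerNode
        {
            public string Address { get; set; }
            public bool IsHealthy { get; set; }   // updated by the "check into the gateway" heartbeat
        }

        public class RoundRobinGateway
        {
            private readonly List<ServerNode> _nodes;
            private int _next = -1;

            public RoundRobinGateway(IEnumerable<ServerNode> nodes)
            {
                _nodes = nodes.ToList();
            }

            // Pick the next healthy server; requests are spread evenly rather than by measured load.
            public ServerNode Pick()
            {
                for (int i = 0; i < _nodes.Count; i++)
                {
                    int index = (Interlocked.Increment(ref _next) & int.MaxValue) % _nodes.Count;
                    if (_nodes[index].IsHealthy) return _nodes[index];
                }
                return null; // no healthy server available
            }
        }

    If every node also keeps this list, as suggested above, any node can take over as gateway when the current one dies, which avoids the single point of failure without a dedicated backup gateway.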

    Read the article

  • Announcement: Employee Info Starter Kit (v6.0–ASP.NET MVC Edition) is Released

    - by Mohammad Ashraful Alam
    Originally posted on: http://geekswithblogs.net/joycsharp/archive/2013/06/16/announcement-employee-info-starter-kit-v6.0asp.net-mvc-edition-is-released.aspx
    After a long wait, the next version of Employee Info Starter Kit is released! This starter kit is basically a project template that contains code samples targeting a specific technology, such as ASP.NET Web Form, ASP.NET MVC etc. Since its first release, this open source project gained a huge popularity in the developer community and had 250K+ combined downloads. This starter kit is honored to be placed at the official ASP.NET site, along with other asp.net starter kits, which all are being considered as the “best” ASP.NET coding standards, recommended by Microsoft. EISK is showcased in Microsoft’s Channel 9’s Weekly Show, as well. The ASP.NET MVC Edition of the new version 6.0 bundles most of the greatest and successful platforms, frameworks and technologies together, to enable web developers to learn and build manageable and high performance web applications with rich user experience effectively and quickly.
    User End Specifications
    - Creating a new employee record
    - Read existing employee records
    - Update an existing employee record
    - Delete existing employee records
    - Role based security model
    Key Technology Areas
    - ASP.NET MVC 4
    - Entity Framework 4.3.1
    - Sql Server Compact Edition 4
    - Visual Studio 2012
    QuickStart Guide
    Getting started with EISK 6.0 ASP.NET is pretty easy. Once you've Visual Studio 2012 installed, then just follow the steps as provided below:
    - Download the EISK 6.0 MVC version.
    - Extract the file.
    - From the extracted folder, click the solution file "Eisk.MVC-VS2012.sln".
    - Right click the "Eisk.MVC" project node and select "Set as StartUp Project".
    - Hit Ctrl+F5 and explore!
    Architectural Overview
    - Overall architecture is based on Model-View-Controller pattern
    - Support for desktop & mobile browsers.
    - Usage of Domain Model, Repository and Unit of Work pattern from Domain Driven Development approach
    - Usage of Data Annotations in model (entity) classes to centralize basic validation mechanism that facilitates DRY principle
    - Usage of IValidatableObject interface in model (entity) classes that isolates custom business logic from application layer
    - Usage of OOP inheritance and Value Object pattern in model (entity) classes that provides reusability in application architecture
    - Usage of View Model, Editor Model pattern that provides mechanism for testable view rendering logic
    - Several helper classes and extension methods to enable developers build application with reduced code
    If you want to learn more about it in details, just check the following links: Getting Started - Hands on Coding Walkthrough – Technology Stack - Design & Architecture
    Enjoy!

    Read the article

  • How I might think like a hacker so that I can anticipate security vulnerabilities in .NET or Java before a hacker hands me my hat [closed]

    - by Matthew Patrick Cashatt
    Premise
    I make a living developing web-based applications for all form-factors (mobile, tablet, laptop, etc). I make heavy use of SOA, and send and receive most data as JSON objects. Although most of my work is completed on the .NET or Java stacks, I am also recently delving into Node.js. This new stack has got me thinking that I know reasonably well how to secure applications using known facilities of .NET and Java, but I am woefully ignorant when it comes to best practices or, more importantly, the driving motivation behind the best practices. You see, as I gain more prominent clientele, I need to be able to assure them that their applications are secure and, in order to do that, I feel that I should learn to think like a malevolent hacker.
    1. What motivates a malevolent hacker: What is their prime mover? What is it that they are most after? Ultimately, the answer is money or notoriety I am sure, but I think it would be good to understand the nuanced motivators that lead to those ends: credit card numbers, damning information, corporate espionage, shutting down a highly visible site, etc.
    2. As an extension of question #1--but more specific--what are the things most likely to be sought out by a hacker in almost any application? Passwords? Financial info? Profile data that will gain them access to other applications a user has joined? Let me be clear here. This is not judgement for or against the aforementioned motivations because that is not the goal of this post. I simply want to know what motivates a hacker regardless of our individual judgement.
    3. What are some heuristics followed to accomplish hacker goals? Ultimately specific processes would be great to know; however, in order to think like a hacker, I would really value your comments on the broader heuristics followed. For example: "A hacker always looks first for the low-hanging fruit such as http spoofing" or "In the absence of a CAPTCHA or other deterrent, a hacker will likely run a cracking script against a login prompt and then go from there." Possibly, "A hacker will try and attack a site via Foo (browser) first as it is known for Bar vulnerability."
    4. What are the most common hacks employed when following the common heuristics? Specifics here. Http spoofing, password cracking, SQL injection, etc.
    Disclaimer
    I am not a hacker, nor am I judging hackers (Heck--I even respect their ingenuity). I simply want to learn how I might think like a hacker so that I may begin to anticipate vulnerabilities before .NET or Java hands me a way to defend against them after the fact.
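    As a small illustration of question #4, SQL injection really is the classic low-hanging fruit: it only takes string concatenation to open the door, and a parameter to close it. A hedged C# sketch (the Users table and column names are made up):

        using System.Data.SqlClient;

        // Vulnerable: user input is concatenated straight into the SQL text,
        // so input like  ' OR '1'='1  changes the query itself.
        public static SqlCommand BadLogin(SqlConnection conn, string user)
        {
            return new SqlCommand(
                "SELECT * FROM Users WHERE UserName = '" + user + "'", conn);
        }

        // Safer: the value travels as a parameter, never as SQL text.
        public static SqlCommand SaferLogin(SqlConnection conn, string user)
        {
            var cmd = new SqlCommand(
                "SELECT * FROM Users WHERE UserName = @user", conn);
            cmd.Parameters.AddWithValue("@user", user);
            return cmd;
        }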

    Read the article

  • Dynamic Filtering

    - by Ricardo Peres
    Continuing my previous posts on dynamic LINQ, now it's time for dynamic filtering. For now, I'll focus on string matching. There are four standard operators for string matching, which NHibernate, Entity Framework and LINQ to SQL all recognize:
    - Equals
    - Contains
    - StartsWith
    - EndsWith
    So, if we want to apply filtering by one of these operators on a string property, we can use this code:
        public enum MatchType
        {
            StartsWith = 0,
            EndsWith = 1,
            Contains = 2,
            Equals = 3
        }

        public static List<T> Filter<T>(IEnumerable<T> enumerable, String propertyName, String filter, MatchType matchType)
        {
            return (Filter(enumerable, typeof(T), propertyName, filter, matchType) as List<T>);
        }

        public static IList Filter(IEnumerable enumerable, Type elementType, String propertyName, String filter, MatchType matchType)
        {
            MethodInfo asQueryableMethod = typeof(Queryable).GetMethods(BindingFlags.Static | BindingFlags.Public).Where(m => (m.Name == "AsQueryable") && (m.ContainsGenericParameters == false)).Single();
            IQueryable query = (enumerable is IQueryable) ? (enumerable as IQueryable) : asQueryableMethod.Invoke(null, new Object[] { enumerable }) as IQueryable;
            MethodInfo whereMethod = typeof(Queryable).GetMethods(BindingFlags.Public | BindingFlags.Static).Where(m => m.Name == "Where").ToArray()[0].MakeGenericMethod(elementType);
            MethodInfo matchMethod = typeof(String).GetMethod
            (
                (matchType == MatchType.StartsWith) ? "StartsWith" : (matchType == MatchType.EndsWith) ? "EndsWith" : (matchType == MatchType.Contains) ? "Contains" : "Equals",
                new Type[] { typeof(String) }
            );
            PropertyInfo displayProperty = elementType.GetProperty(propertyName, BindingFlags.Public | BindingFlags.Instance);
            MemberExpression member = Expression.MakeMemberAccess(Expression.Parameter(elementType, "n"), displayProperty);
            MethodCallExpression call = Expression.Call(member, matchMethod, Expression.Constant(filter));
            LambdaExpression where = Expression.Lambda(call, member.Expression as ParameterExpression);
            query = whereMethod.Invoke(null, new Object[] { query, where }) as IQueryable;
            MethodInfo toListMethod = typeof(Enumerable).GetMethod("ToList", BindingFlags.Static | BindingFlags.Public).MakeGenericMethod(elementType);
            IList list = toListMethod.Invoke(null, new Object[] { query }) as IList;
            return (list);
        }

        var list = new[] { new { A = "aa" }, new { A = "aabb" }, new { A = "ccaa" }, new { A = "ddaadd" } };

        var contains = Filter(list, "A", "aa", MatchType.Contains);
        var endsWith = Filter(list, "A", "aa", MatchType.EndsWith);
        var startsWith = Filter(list, "A", "aa", MatchType.StartsWith);
        var equals = Filter(list, "A", "aa", MatchType.Equals);
    Perhaps I'll write some more posts on this subject in the near future.
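    As a usage note, the reflective call is just a stand-in for the predicate you would write by hand when the element type and property are known at compile time; for the sample array above, Filter(list, "A", "aa", MatchType.Contains) should return the same elements as:

        var containsStatic = list.AsQueryable().Where(n => n.A.Contains("aa")).ToList();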

    Read the article

  • Tap into MySQL's Amazing Performance Results with the Performance Tuning Course

    - by Antoinette O'Sullivan
    Want to leverage the high-speed load utilities, distinctive memory caches, full text indexes, and other performance-enhancing mechanisms that MySQL offers to fuel today's critical business systems? The authentic MySQL Performance Tuning course, in 4 days, teaches you to evaluate the MySQL architecture, learn to use the tools, configure the database for performance, tune application and SQL code, tune the server, examine the storage engines, assess the application architecture, and learn general tuning concepts. You can take this course in one of the following three ways:
    - Training-on-Demand: Access the streaming video, instructor delivery of this course from your own desk, at your own pace. Book time for hands-on practice when it suits you.
    - Live-Virtual Class: Take this instructor-led class live from your own desk. With 700 events on the schedule you are sure to find a time and date to suit you!
    - In-Class: Travel to a classroom to take this class. A sample of the events on the schedule is as follows.
        Location                    Date                 Delivery Language
        Hamburg, Germany            22 October 2012      German
        Prague, Czech Republic      1 October 2012       Czech
        Warsaw, Poland              3 December 2012      Polish
        London, England             19 November 2012     English
        Rome, Italy                 23 October 2012      Italian
        Lisbon, Portugal            6 November 2012      European Portuguese
        Aix en Provence, France     4 September 2012     French
        Strasbourg, France          16 October 2012      French
        Nieuwegein, Netherlands     26 November 2012     Dutch
        Madrid, Spain               17 December 2012     Spanish
        Mechelen, Belgium           1 October 2012       English
        Riga, Latvia                10 December 2012     Latvian
        Petaling Jaya, Malaysia     10 September 2012    English
        Edmonton, Canada            10 December 2012     English
        Vancouver, Canada           10 December 2012     English
        Ottawa, Canada              26 November 2012     English
        Toronto, Canada             26 November 2012     English
        Montreal, Canada            26 November 2012     English
        Mexico City, Mexico         10 September 2012    Spanish
        Sao Paulo, Brazil           26 November 2012     Brazilian Portuguese
        Tokyo, Japan                19 November 2012     Japanese
    For further information on this class, or to register your interest in additional events, go to the Oracle University Portal: http://oracle.com/education/mysql

    Read the article

  • Stagnating in programming

    - by Coder
    Time after time this question came up in my mind, but up until today I wasn't thinking about it much. I have been programming for maybe around 8 years now, and for the last two years it seems I'm not as keen to pick up new technologies anymore. Maybe that's a burnout or something, but I'd say it's experience and what I like, that's stopping me from running after the latest and greatest. I'm C++ developer, by this I mean, I love close to metal programming. I have no problems tracing problems through assembly, using tools like WinDbg or HexView. When I use constructs, I think about how they are realized underneath, how the bits are set and unset under the hood. I love battling with complex threading problems and doing everything hardcore way, even by hand if the regular solutions seem half baked. But I also love the C++0x stuff, and use it a lot. And all C++ code as long as it's not cumbersome compared to C counterparts, sometimes I also fall back to sort of "Super C" if the C++ way is ugly. And then there are all other developers who seem to be way more forward looking, .Net 4.0 MVC, WPF, all those Microsoft X#s, LINQ languages, XML and XSLT, mobile devices and so on. I have done a considerable amount of .NET, SQL, ASPX programming, but the further I go, the less I want to try those technologies. Is that bad? Almost every day I hear people saying that managed code is the only way forward, WPF is the way to go. I hear that C++ is godawful, and you can't code anything in it that's somewhat stable. But I don't buy it. With the experience I have, and the knowledge of how native code is compiled and executes, I can say I find it extremely rare that C++ code is unstable, or leaks, or causes crashes that takes more than 30 seconds to identify and fix. And to tell the truth, I've seen enough problems with other "cool" languages that I'd say C++ is even more stable and production proof than the safe languages, at least for me. The only thing that scares me in C++ is new frameworks, I don't trust them, and I use them extra sparingly. STL - yes, ATL - very sparingly, everything else... Well, not very keen on it. Most huge problems I've ran into, all were related to frameworks, not the language itself. Some overrided operator here, bad hierarchy there, poor class design here, mystical castings there. Other than that, C/C++ (yes, I use them together) still seems a very controlled and stable way to develop applications. Am I stagnating? Should I switch a profession, or force myself in all that marketing hype? Are there more developers who feel the same way?

    Read the article

  • Connect to QuickBooks from PowerBuilder using RSSBus ADO.NET Data Provider

    - by dataintegration
    The RSSBus ADO.NET providers are easy-to-use, standards based controls that can be used from any platform or development technology that supports Microsoft .NET, including Sybase PowerBuilder. In this article we show how to use the RSSBus ADO.NET Provider for QuickBooks in PowerBuilder. A similar approach can be used from PowerBuilder with other RSSBus ADO.NET Data Providers to access data from Salesforce, SharePoint, Dynamics CRM, Google, OData, etc. In this article we will show how to create a basic PowerBuilder application that performs CRUD operations using the RSSBus ADO.NET Provider for QuickBooks.
    Step 1: Open PowerBuilder and create a new WPF Window Application solution.
    Step 2: Add all the Visual Controls needed for the connection properties.
    Step 3: Add the DataGrid control from the .NET controls.
    Step 4: Configure the columns of the DataGrid control as shown below. The column bindings will depend on the table.
        <DataGrid AutoGenerateColumns="False" Margin="13,249,12,14" Name="datagrid1" TabIndex="70" ItemsSource="{Binding}">
            <DataGrid.Columns>
                <DataGridTextColumn x:Name="idColumn" Binding="{Binding Path=ID}" Header="ID" Width="SizeToHeader" />
                <DataGridTextColumn x:Name="nameColumn" Binding="{Binding Path=Name}" Header="Name" Width="SizeToHeader" />
                ...
            </DataGrid.Columns>
        </DataGrid>
    Step 5: Add a reference to the RSSBus ADO.NET Provider for QuickBooks assembly.
    Step 6: Optional: Set the QBXML Version to 6. Some of the tables in QuickBooks require a later version of QuickBooks to support updates and deletes. Please check the help for details.
    Connect the DataGrid: Once the visual elements have been configured, developers can use standard ADO.NET objects like Connection, Command, and DataAdapter to populate a DataTable with the results of a SQL query:
        System.Data.RSSBus.QuickBooks.QuickBooksConnection conn
        conn = create System.Data.RSSBus.QuickBooks.QuickBooksConnection(connectionString)
        System.Data.RSSBus.QuickBooks.QuickBooksCommand comm
        comm = create System.Data.RSSBus.QuickBooks.QuickBooksCommand(command, conn)
        System.Data.DataTable table
        table = create System.Data.DataTable
        System.Data.RSSBus.QuickBooks.QuickBooksDataAdapter dataAdapter
        dataAdapter = create System.Data.RSSBus.QuickBooks.QuickBooksDataAdapter(comm)
        dataAdapter.Fill(table)
        datagrid1.ItemsSource=table.DefaultView
    The code above can be used to bind data from any query (set this in command), to the DataGrid. The DataGrid should have the same columns as those returned from the SELECT statement.
    PowerBuilder Sample Project
    The included sample project includes the steps outlined in this article. You will also need the QuickBooks ADO.NET Data Provider to make the connection. You can download a free trial here.

    Read the article

  • Migrating a blog from Orchard 0.5 to 0.9

    - by Bertrand Le Roy
    My personal blog still runs on Orchard 0.5, because the theme that I used to build it is not yet available for more recent versions, but it is still very important for me to know that I can migrate all my content and comments to a new version at any time. Fortunately, Nick Mayne has been consistently shipping a BlogML module a few days after each of the Orchard versions shipped. Because the module gallery for each version is behind a different URL and is kept alive even after a new one shipped, it is very easy to install the module for both versions. Step 0: Setting up the migration environment In order to do the migration, I made a local copy of the production site on my laptop (data included: I'm using SQL CE) and I also created a new local site with a fresh install of Orchard 0.9. Step 1: Enable the gallery feature on both versions From the admin UI, go to Features and locate the Gallery feature under "Packaging". Enable it. You may now click on "Browse Gallery" on the 0.5 instance and "Modules" under "Gallery" for 0.9: Step 2: Install the BlogML module on both versions From the gallery page, locate the BlogML module and install it. Do it on both versions. Then go to Features and enable BlogML under "Content Publishing". Do it on both versions. Step 3: Export from the 0.5 version Click on "Manage Blog" then on "Export using BlogML" from the 0.5 version. The module then informs you of the path of the saved file: Step 4: Import into the 0.9 version From the 0.9 version, click "Import under "Blogs". Click the button to browse to the file that you just saved from 0.5. Then click "Upload file and Import" Step 5: Copy the 0.5 media folder into 0.9 Copy the contents of the 0.5 version's media folder into the media folder of the 0.9 version. Once that is done, you can delete the "Default/Blog Exports" subfolder. Step 6: Configure the target blog Click "Manage Blog", then "Blog Properties" and restore any properties you had on the source blog. For me, it was the title and URL as well as to set the blog as the home page and show it on the main menu: Step 7: Republish the new site to the production server Once this is done and everything works locally, you are ready to publish to the production site. I use FTP. Note: this should work just as well for any couple of versions for which the BlogML module exists, and not just for 0.5 and 0.9.

    Read the article

  • 0xC0017011 and other error messages - what is the error message text?

    Recently there was a bug raised against BIDS Helper which originated in my Expression Editor control. Thankfully the person that raised it kindly included a screenshot, so I had the error code (HRESULT 0xC0017011) and a stack trace that pointed the finger firmly at my control, but no error message text. The code itself looked fine so I searched on the error code but got no results. I'd expected to get a hit from Books Online with the Integration Services Error and Message Reference topic at the very least, but no joy. There is however a more accurate and definitive reference, namely the header file that defines all these codes, dtsmsg.h, which you can find at:
        C:\Program Files (x86)\Microsoft SQL Server\110\SDK\Include\dtsmsg.h
    Looking the code up in the header file gave me a much more useful error message.
        ////////////////////////////////////////////////////////////////////////////
        // The parameter is sensitive
        //
        // MessageId: DTS_E_SENSITIVEPARAMVALUENOTALLOWED
        //
        // MessageText:
        //
        // Accessing value of the parameter variable for the sensitive parameter "%1!s!" is not allowed. Verify that the variable is used properly and that it protects the sensitive information.
        //
        #define DTS_E_SENSITIVEPARAMVALUENOTALLOWED ((HRESULT)0xC0017011L)
    Unfortunately I'd forgotten all about this. By the time I had remembered about it, the person who raised the issue had managed to narrow it down to something to do with having a sensitive parameter. Putting that together with the error message I'd finally found, a quick poke around in the code and I found the new GetSensitiveValue method which seemed to do the trick. The HResult fields are also listed online, but it only shows the short error message, and it doesn't include that all-important HRESULT value itself. So let this be a lesson to you (and me!): if you need to check an SSIS error, go straight to the horse's mouth - dtsmsg.h. This is particularly true when working with early builds, or CTP releases when we expect the documentation to be a bit behind. There is also a programmatic approach to getting better SSIS error messages. I should take another look at the error handling in the control, or the way it is hosted in BIDS Helper. I suspect that if I use an implementation of Microsoft.SqlServer.Dts.Runtime.Wrapper.IDTSInfoEvents100 I could catch the error itself and get the full error message text which I could then report back. This would obviously be a better user experience and also make it easier to diagnose any issues like this in the future. See ExprssionEvaluator.cs for an example of this in use in the Expression Editor control.
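    Since grepping a header by hand gets old quickly, here is a small throwaway C# sketch (not part of the original post) that looks up an HRESULT in dtsmsg.h and prints the comment block above the matching #define; the path is the one quoted above and may differ for your SQL Server version:

        using System;
        using System.IO;
        using System.Linq;

        class DtsMsgLookup
        {
            static void Main(string[] args)
            {
                // Path as given in the post; adjust for your SQL Server version.
                const string header = @"C:\Program Files (x86)\Microsoft SQL Server\110\SDK\Include\dtsmsg.h";
                string code = args.Length > 0 ? args[0] : "0xC0017011";

                string[] lines = File.ReadAllLines(header);
                for (int i = 0; i < lines.Length; i++)
                {
                    if (lines[i].IndexOf(code, StringComparison.OrdinalIgnoreCase) < 0) continue;

                    // Walk back over the comment block that precedes the #define, then print it.
                    int start = i;
                    while (start > 0 && lines[start - 1].TrimStart().StartsWith("//")) start--;
                    foreach (string line in lines.Skip(start).Take(i - start + 1))
                        Console.WriteLine(line);
                    break;
                }
            }
        }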

    Read the article

  • Why does aptitude want to remove a bunch of files?

    - by Mediterran81
    Recently I encountered dependencies resolve issues when using APTITUDE (it is my favorite). Nevertheless, I started to feel that APTITUDE does not behave as it is supposed to be in 64 bits systems while apt-get works fine. Can someone confirm that APTITUDE is buggy in Ubuntu 11.10 amd64? Edit: For example, when tried to install ntfs-config using APTITUDE, it asked me to remove over 100 packages (skype for example), while using apt-get worked fine. han@L502X:~$ sudo aptitude install ntfs-config [sudo] password for han: The following NEW packages will be installed: ntfs-3g{ab} ntfs-config 0 packages upgraded, 2 newly installed, 0 to remove and 0 not upgraded. Need to get 0 B/640 kB of archives. After unpacking 2,466 kB will be used. The following packages have unmet dependencies: ntfs-3g: Conflicts: ntfsprogs but 2.0.0-1ubuntu4 is installed. The following actions will resolve these dependencies: Remove the following packages: 1) flashplugin-downloader 2) flashplugin-installer 3) libasound2 4) libasound2-plugins 5) libasyncns0 6) libatk1.0-0 7) libaudio2 8) libavahi-client3 9) libavahi-common3 10) libc6 11) libcairo2 12) libcomerr2 13) libcups2 14) libcurl3 15) libdatrie1 16) libdb5.1 17) libdbus-1-3 18) libdbusmenu-qt2 19) libexpat1 20) libffi6 21) libflac8 22) libfontconfig1 23) libfreetype6 24) libgcc1 25) libgcrypt11 26) libgdk-pixbuf2.0-0 27) libglib2.0-0 28) libgnutls26 29) libgpg-error0 30) libgssapi-krb5-2 31) libgtk2.0-0 32) libice6 33) libidn11 34) libjack-jackd2-0 35) libjasper1 36) libjpeg62 37) libjson0 38) libk5crypto3 39) libkeyutils1 40) libkrb5-3 41) libkrb5support0 42) liblcms1 43) libldap-2.4-2 44) libmng1 45) libnspr4 46) libnspr4-0d 47) libnss3 48) libnss3-1d 49) libogg0 50) libpango1.0-0 51) libpcre3 52) libpixman-1-0 53) libpng12-0 54) libpulse0 55) libqt4-dbus 56) libqt4-declarative 57) libqt4-network 58) libqt4-script 59) libqt4-sql 60) libqt4-xml 61) libqt4-xmlpatterns 62) libqtcore4 63) libqtgui4 64) librtmp0 65) libsamplerate0 66) libsasl2-2 67) libsasl2-modules 68) libselinux1 69) libsm6 70) libsndfile1 71) libspeexdsp1 72) libsqlite3-0 73) libssl1.0.0 74) libstdc++6 75) libtasn1-3 76) libthai0 77) libtiff4 78) libuuid1 79) libvorbis0a 80) libvorbisenc2 81) libwrap0 82) libx11-6 83) libxau6 84) libxcb-render0 85) libxcb-shm0 86) libxcb1 87) libxcomposite1 88) libxcursor1 89) libxdamage1 90) libxdmcp6 91) libxext6 92) libxfixes3 93) libxft2 94) libxi6 95) libxinerama1 96) libxrandr2 97) libxrender1 98) libxss1 99) libxt6 100) libxv1 101) nspluginviewer 102) nspluginwrapper 103) ntfsprogs 104) skype 105) sni-qt 106) zlib1g Leave the following dependencies unresolved: 107) flashplugin-downloader recommends libasound2-plugins (>= 1.0.16) Accept this solution? [Y/n/q/?]

    Read the article

  • After 10 Years, MySQL Still the Right Choice for ScienceLogic's "Best Network Monitoring System on the Planet"

    - by Rebecca Hansen
    ScienceLogic has a pretty fantastic network monitoring appliance. So good in fact that InfoWorld gave it their "2013 Best Network Monitoring System on the Planet" award. Inside their "ultraflexible, ultrascalable, carrier-grade" enterprise appliance, ScienceLogic relies on MySQL and has since their start in 2003. Check out some of the things they've been able to do with MySQL and their reasons for continuing to use MySQL in these highlights from our new MySQL ScienceLogic case study.
    ScienceLogic's larger customers use their appliance to monitor and manage 20,000+ devices, each of which generates a steady stream of data and a workload that is 85% write. On a large system, the MySQL database:
    - Averages 8,000 queries every second or about 1 billion queries a day
    - Can reach 175,000 tables and up to 20 million rows in a single table
    - Is 2 terabytes on average and up to 6 terabytes
    "We told our customers they could add more and more devices. With MySQL, we haven't had any problems. When our customers have problems, we get calls. Not getting calls is a huge benefit." - Matt Luebke, ScienceLogic Chief Software Architect.
    ScienceLogic was approached by a number of Big Data / NoSQL vendors, but decided against using a NoSQL-only solution. Said Matt, "There are times when you really need SQL. NoSQL can't show me the top 10 users of CPU, or show me the bottom ten consumer of hard disk. That's why we weren't interested in changing and why we are very interested in MySQL 5.6. It's great that it can do relational and key-value using memcached." The ScienceLogic team is very cautious about putting only very stable technology into their product, and according to Matt, MySQL has been very stable: "We've been using MySQL for 10 years and we have never had any reliability problems. Ever." ScienceLogic now uses SSDs for their write-intensive appliance and that change alone has helped them achieve a 5x performance increase.
    Learn more:
    - ScienceLogic MySQL Case Study
    - MySQL 5.6 InnoDB Compression options for better SSD performance
    - Tuning MySQL 5.6 for Great Product Performance - on demand webinar
    - Developer and DBA Guide to MySQL 5.6 white paper
    - Guide to MySQL and NoSQL: The Best of Both Worlds white paper

    Read the article

  • Hello From South Florida

    - by Sam Abraham
    Fellow Blog Readers: I figured I use my first blog post on GeeksWithBlogs to introduce myself.   I recently relocated from Long Island, NY to South Florida where I joined a local company as Software Engineer specializing in technologies such as C#, ASP.Net 3.5, WCF, Silverlight, SQL Server 2008 and LINQ, to name a few. I am an MCP and MCTS ASP.Net 3.5, looking to get my .Net 4.0 certification soon.   Having been in industry for a few years so far, I figured I would share with you my take on the importance of being involved(at least attending) in local user groups.   I am a firm believer that besides using a certain technology, the best way to expand one’s knowledge is by sharing it with others and being equally open to learn from others just as much as you are willing to share what you know.   In my opinion, an important factor that makes a good developer stand-out is his/her ability to keep abreast with the latest and greatest even in areas outside his/her direct expertise.   Additionally, having spoken to various recruiters, technical user group attendees are always favorably looked upon as genuinely interested in their field and willing to take the initiative to expand their knowledge which offers job candidates good leverage when competing for jobs.   I believe I am very blessed to be in an area with a very strong and vibrant developer community. I found in the local .Net community leadership a genuine interest in constantly extending the opportunity to all developers to get more involved and encouraging those who are willing to take that initiative achieve their goal: Speak in meetings, volunteer at events or write and publish articles/blogs about latest and greatest technologies.   With Vishal Shukla (Site director for the West Palm Beach .Net User Group) traveling overseas, I have been extended the opportunity to come on board as site coordinator for FladotNet's WPB .Net User Group along with Venkata Subramanian, an opportunity which I gratefully accepted.   Being involved in running a .Net User Group will surely help me personally and professionally, but my real hope is to use this opportunity to assist in delivering the ultimate common-goal: spread the word about new .Net Technologies, help everybody get more involved and simply have fun learning new things.   With my introduction out of the way, in the next few days I will be posting some notes on an upcoming talk I will be giving about MVC2 and VS2010 in mid-April.   Environment.Exit(0); --Sam

    Read the article

  • Why I Love Microsoft Development

    - by Brian Lanham
    I've been writing software for a while and recently had an opportunity to broaden my horizons and start developing for iOS. We decided to leverage, as much as possible, our existing skills and use MonoTouch and MonoDevelop by Novell.    For those of you who do not know, Mono is a .NET port originally designed for Linux but adapted for other platforms as well. MonoTouch is a port specifically for building iOS applications using the .NET framework. MonoDroid is a port (in CTP-esque release) for Android.   A MISSING COMPONENT - VISUAL DESIGNER   MonoDevelop lacks one very significant component compared with other tools I am using: NO VISUAL DESIGNER. Instead of using an integrated visual designer, MonoDevelop shells to the Mac OS "Interface Builder".  Since MonoDevelop lets me have a "Visual Studio-esque" feel *and* I get to use C#, AND it's FREE, I am gladly willing to overlook this.  In fact, it's not even a question.  Free?  Sure, I'll take it with no Visual Designer.   In my experiences I've grown from UNIX and DOS to .NET development through many steps. Java/JSP/Servlets; Windows; Web; etc. I've been doing .NET for quite a few years and I guess I just got "comfortable" with the tools.   WHY AM I NOT GETTING IT?   Interface Builder (IB) is amazingly confusing for me. I had the opportunity to speak at the Northern VA Code Camp on 12/11/2010. My presentation was "Getting Started with iOS Development using MonoTouch and C#".    At the visual design part of the presentation, I asked one of the 3 or 4 Mac developers in the room about my confusion with the IB. I don't understand why the "Classes" list includes objects. I don't understand what "File's Owner" is. And, most importantly, WHAT THE HECK IS AN OUTLET AND WHY IS IT NECESSARY?!?!?"   His response to these question (especially Outlets): "They did it wrong."   I'm accustom to a visual designer that creates variables for graphical widgets for me. Not IB. Instead, I have to create "Outlets" manually. I still do not understand why and, the explanation from a seasoned Mac developer is that it's wrong. (He received nods of confirmation from the other Mac devs in the room.)   I LOVE MS DEV   I love development for Microsoft platforms using Microsoft development tools. I love Windows 7. I love Visual Studio 2010. I love SQL Server. Azure, Entity Framework, Active Directory, Office, WCF/WF/WPF, etc. are all designed with integration in mind. They are also all designed with developers in mind.   Steve Ballmer recently ranted "It's the developers!" That's why it is relatively quick to build apps using MS tools. Clearly, MS knows that while we usually enjoy building technology solutions, we are here to make money. And we need tools that accelerate our time to market without compromising the power and quality of our solutions.   So, yeah, I am sucking up I guess. But I love Microsoft Development. Thank you, Microsoft, for providing the plethora of great development tools.    P.S. (but please slow down a bit…I'm having trouble keeping up!)

    Read the article

  • ArchBeat Link-o-Rama for 2012-06-14

    - by Bob Rhubart
    Duke's Choice Award Nominations Close Friday! | The Java Source The Duke's Choice Awards celebrate extreme innovation in the world of Java technology. Nominate an individual, a group or company who show the best in Java innovation. Nominate at Java.net/dukeschoice. Nominations are open until this Friday, June 15. Whole Lotta Virtualization Goin' On | Rick Ramsey The OTN Garage's Rick Ramsey shares a list of recent Virtualization articles available on OTN, along with a link to a video by The Killer, Mr Jerry Lee Lewis. A Pragmatic Path to Navigating your Infrastructure to the Cloud | The WebLogic Server Blog Ruma Sanyal offers an overview of a recent Oracle webcast featuring Gartner VP and Distinguished Analyst Andy Butler and Vice President and Gartner Fellow Massimo Pezzini. Migrating C/C++ embedded SQL code | Tom Laszewski Cloud migration expert Tom Laszewski explains the how-to in 5 easy steps. Aetna Dumps Its Siloed Enterprise Architecture for SOA | CIO.com CIO writer Stephanie Overby tells the story of how one major health insurance provider put the "Enterprise" back in Enterprise Architecture. (H/T to Joe McKendrick for this story.) Downloading specific video renditions in WebCenter Content | Kyle Hatlestad How-to from Oracle WebCenter & ADF A-Team blogger Kyle Hatlestad. Eclipse DemoCamp - June 2012 - Redwood Shores, CA Location: Oracle HQ - 10 Twin Dolphin Drive, Redwood Shores, CA (Map) Date and Time: Wednesday, June 13, 2012. From 6pm - 9pm Agenda: The evolution of Java persistence, Doug Clarke, EclipseLink Project Lead, Oracle Integrating BIRT into Applications, Ashwini Verma, Actuate Corporation Leveraging OSGi In The Enterprise, Kamal Muralidharan, Lead Engineer, eBay Developing Rich ADF Applications with Java EE, Greg Stachnick, Oracle NVIDIA® NsightTM Eclipse Edition, Goodwin (Tech lead - Visual tools), Eugene Ostroukhov (Senior engineer – Visual tools) 2012 Oracle Fusion Middleware Innovation Awards - Win a FREE Pass to Oracle OpenWorld 2012 in San Francisco Share your use of Oracle Fusion Middleware solutions and how they help your organization drive business innovation. You just might win a free pass to Oracle Openworld 2012 in San Francisco. Deadline for submissions in July 17, 2012. BI Architecture Master Class for Partners – Oracle Architecture Unplugged Date: June 21, 2012 No slides, no fluff. This workshop will be highly interactive and is aimed at Oracle OPN member partners who are IT Architects and BI+W specialists. The focus will be on architectural issues and considerations. DevOps: Evolving to Handle Disruption | JP Morgenthal The subject of DevOps came up this week during an OTN ArchBeat podcast interview with Ron Batra and James Baty on the role of the cloud architect (that program will be available in a few weeks). Morgenthal's article for InfoQ offers a good overview of what DevOps is and how it works. Thought for the Day "Elegance is not a dispensable luxury but a factor that decides between success and failure." — Edsger Dijkstra Source: softwarequotes.com

    Read the article

  • Happy New Year! Upcoming Events in January 2011

    - by mandy.ho
    Oracle Database kicks off the New Year at the following events during the month of January. Hope to see you there and please send in your pictures and feedback! Jan 20, 2011 - San Francisco, CA LinkShare Symposium West 2011 Oracle is a proud Gold Sponsor at the LinkShare Symposium West 2011 January 20 in San Francisco, California. Year after year LinkShare has been bringing their network the opportunity to come to life. At the LinkShare Symposium online performance marketing leaders meet to optimize face-to-face during a full day of networking. Learn more by attending Oracle Breakout Session, "Omni - Channel Retailing, What is possible now?" on Thursday, January 20, 11:15 a.m. - 12:00 noon, Grand Ballroom. http://eventreg.oracle.com/webapps/events/ns/EventsDetail.jsp?p_eventId=128306&src=6954634&src=6954634&Act=397 Jan 24, 2011 - Cincinnati, OH Greater Cincinnati Oracle User Group Meeting "Tom Kyte Day" - Featuring a day of sessions presented by Senior Technical Architect, Tom Kyte. Sessions include "Top 10, no 11, new features of Oracle Database 11g Release 2" and "What do I really need to know when upgrading", plus more. http://www.gcoug.org/ Jan 25, 2011 - Vancouver, British Columbia Oracle Security Solutions Forum Featuring a Special Keynote Presentation from Tom Kyte - Complete Database Security Join us at this half-day event; Oracle Database Security Solutions: Complete Information Security. Learn how Oracle Database Security solutions help you: • Prevent external threats like SQL injection attacks from reaching your databases • Transparently encrypt application data without application changes • Prevent privileged database users and administrators from accessing data • Use native database auditing to monitor and report on database activity • Mask production data for safe use in nonproduction environments http://eventreg.oracle.com/webapps/events/ns/EventsDetail.jsp?p_eventId=126974&src=6958351&src=6958351&Act=97 Jan 26, 2011 - Halifax, Nova Scotia Oracle Database Security Technology Day Exclusive Seminar on Complete Information Security with Oracle Database 11g The amount of digital data within organizations is growing at unprecedented rates, as is the value of that data and the challenges of safeguarding it. Yet most IT security programs fail to address database security--specifically, insecure applications and privileged users. So how can you protect your mission-critical information? Avoid risky third-party solutions? Defend against security breaches and compliance violations? And resist costly new infrastructure investments? Join us at this half-day seminar, Oracle Database Security Solutions: Complete Information Security, to find out http://eventreg.oracle.com/webapps/events/ns/EventsDetail.jsp?p_eventId=126269&src=6958351&src=6958351&Act=93

    Read the article

  • WPF: Reloading app parts to handle persistence as well as memory management.

    - by Ingó Vals
    I created an app using Microsoft's WPF. It mostly handles data reading and input as well as associating relations between data within specific parameters. As a total beginner I made some bad design decisions (not so much decisions as using the first thing I got to work), but now, understanding WPF better, I'm getting the urge to refactor my code with better design principles. I had several problems but I guess each deserves its own question for clarity. Here I'm asking for proper ways to handle the data itself. In the original I wrapped each row in an object when fetched from the database (using LINQ to SQL), somewhat like Active Record, just not active or persistent (each app instance had its own data handling part). The app has subunits handling different aspects. However, as it was set up, it loaded everything when started. This creates several problems; for example, often it wouldn't be necessary to load a part unless we were specifically going to work with that part, so I want some form of lazy loading. Also there was a problem with inner persistence, because you might create a new object/row in one aspect and perhaps set a relation between it and a different object, but the new object wouldn't appear until the program was restarted. Persistence between instances of the app won't be a huge problem because of the small amount of people using the program. While I could solve this now using dirty tricks, I would rather refactor the program and do it elegantly. Now the question is how. I know there are several ways and a few come to mind:
    1) Each aspect of the program is its own UserControl that gets reloaded/instanced every time you navigate to it. This ensures you only load up the data you need and you get some persistence. The DB server is located on the same LAN and the tables are small, so that shouldn't be a big problem. A minor drawback is that you would have to remember the state of each aspect so you wouldn't always start at beginner's square.
    2) Having a ViewModel type object at the base level of the app with lazy loading and some kind of timeout. I would then propagate this object down the visual tree to ensure every aspect is getting its data from the same instance.
    3) Semi active record data layer with static load methods.
    4) Some other idea.
    What in your opinion is the most practical way in WPF, and what does MVVM assume?
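    For what option 2 might look like, here is a minimal C# sketch using Lazy<T>; the section view model names are hypothetical and the data loading is only hinted at:

        using System;

        // Hypothetical section view models; each loads its own data in its
        // constructor or in an explicit Load() called on first use.
        public class CompaniesViewModel { /* load companies here */ }
        public class ContactsViewModel { /* load contacts here */ }

        public class AppViewModel
        {
            // Lazy<T> defers both construction and the data load until the
            // section is actually navigated to, instead of loading everything at startup.
            private readonly Lazy<CompaniesViewModel> _companies =
                new Lazy<CompaniesViewModel>(() => new CompaniesViewModel());
            private readonly Lazy<ContactsViewModel> _contacts =
                new Lazy<ContactsViewModel>(() => new ContactsViewModel());

            public CompaniesViewModel Companies { get { return _companies.Value; } }
            public ContactsViewModel Contacts { get { return _contacts.Value; } }
        }

    The "timeout" part could then be as simple as replacing a Lazy instance after a given age, so the next access reloads fresh data; propagating one AppViewModel down the tree as the DataContext keeps every aspect reading from the same instance.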

    Read the article

  • How to join two collections with LINQ

    - by JustinGreenwood
    Here is a simple and complete example of how to perform joins on two collections with LINQ. I wrote it for a friend to show him, in one simple file, the power of LINQ queries and anonymous objects. In the file below, there are two simple data classes defined: Person and Item. In the beginning of the main method, two collections are created. Note that the Item's OwnerId field references the PersonId of a Person object. The effect of the LINQ query below is equivalent to a SQL statement looking like this:
        select Person.PersonName as OwnerName, Item.ItemName as OwnedItem
        from Person
        inner join Item on Item.OwnerId = Person.PersonId
        order by Item.ItemName desc;

        using System;
        using System.Collections.Generic;
        using System.Linq;

        namespace LinqJoinAnonymousObjects
        {
            class Program
            {
                class Person
                {
                    public int PersonId { get; set; }
                    public string PersonName { get; set; }
                }

                class Item
                {
                    public string ItemName { get; set; }
                    public int OwnerId { get; set; }
                }

                static void Main(string[] args)
                {
                    // Create two collections: one of people, and another with their possessions.
                    var people = new List<Person>
                    {
                        new Person { PersonId=1, PersonName="Justin" },
                        new Person { PersonId=2, PersonName="Arthur" },
                        new Person { PersonId=3, PersonName="Bob" }
                    };

                    var items = new List<Item>
                    {
                        new Item { OwnerId=1, ItemName="Armor" },
                        new Item { OwnerId=1, ItemName="Book" },
                        new Item { OwnerId=2, ItemName="Chain Mail" },
                        new Item { OwnerId=2, ItemName="Excalibur" },
                        new Item { OwnerId=3, ItemName="Bubbles" },
                        new Item { OwnerId=3, ItemName="Gold" }
                    };

                    // Create a new, anonymous composite result for person id=2.
                    var compositeResult = from p in people
                                          join i in items on p.PersonId equals i.OwnerId
                                          where p.PersonId == 2
                                          orderby i.ItemName descending
                                          select new { OwnerName = p.PersonName, OwnedItem = i.ItemName };

                    // The query doesn't evaluate until you iterate through the query or convert it to a list
                    Console.WriteLine("[" + compositeResult.GetType().Name + "]");

                    // Convert to a list and loop through it.
                    var compositeList = compositeResult.ToList();
                    Console.WriteLine("[" + compositeList.GetType().Name + "]");
                    foreach (var o in compositeList)
                    {
                        Console.WriteLine("\t[" + o.GetType().Name + "] " + o.OwnerName + " - " + o.OwnedItem);
                    }
                    Console.ReadKey();
                }
            }
        }
    The output of the program is below:
        [WhereSelectEnumerableIterator`2]
        [List`1]
        [<>f__AnonymousType1`2] Arthur - Excalibur
        [<>f__AnonymousType1`2] Arthur - Chain Mail
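    As a side note (not part of the original post), the same query can be written in LINQ method syntax, which is roughly what the compiler produces from the query expression above:

        var methodSyntax = people
            .Join(items, p => p.PersonId, i => i.OwnerId, (p, i) => new { p, i })
            .Where(t => t.p.PersonId == 2)
            .OrderByDescending(t => t.i.ItemName)
            .Select(t => new { OwnerName = t.p.PersonName, OwnedItem = t.i.ItemName });

    Either form yields the same two rows for Arthur shown in the output.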

    Read the article

  • 302: this blog will be closed

    - by preishuber
    After nearly 7 years I will discontinue blogging on this site. My resources are limited. You can reach my German blog which is used to support my customers. Looking back to a long an interesting journey ASP.NET by ScottGu That was the reason to attend this site and support Microsoft as much as I can. For that I was honored as ASP.NET MVP- thanks again. Meet Scoot several times. Great guy! Forums I have left NNTP forums a few years ago and now Microsoft closed it- It was my idea ;-) AJAX Was the wrong way- JQuery won the game IIS7 That is really a great plattform and the IIS team rules. I am sad that is so silent around that topic. ASP.NET after 2.0 Is no longer my world. I love ASP.NET and ASP.NET Server controls. I hate the discussion about how to follow the holy rules of MVC. Microsoft have dropped the goal to bring ASP.NET to #1 and accepted PHP is it. Facebook & Twittering Microblogging takes over a part of the blogging business. Shorter faster cheaper- or as SteveB mentioned - do more with less. Google Google is taking over the web. I am using Bing every time as I can but Google have more options. Sorry Microsoft you will loose that game. Apple That is not the biggest problem of Microsoft. the Ixxx takes over a small part but big money of the market, but the customers are not strongly linked. New wave new hype- Game over Apple. Silverlight My new home. I can reuse a lot of my skills and love the possibilitys. Silverligth will passing WPF-and strike Flash Windows phone 7 Also my skills fit. I just will use it for fun. I am not really satisfied about what I have heard from MIX. Guys from Redmond, I am sad to say you have been the best Smartphone OS and lost everything. The ADO vNext Story That will be the next mystic point. WCF, REST, JSON, ATOM and now OData. Nothing about SQL commands. LINQ, ORM is also not the final solution for multilayered disconnected async scenarios. Personally I prefere the OData idea and dislike the Swiss Army Knife (German Eierlegende Wollmilchsau) WCF. I am still in INETA Speakers board and I am glad to come to your user group. In all other cases you can hire me over ppedv AG. Good by and have good live.

    Read the article

  • Microsoft Virtual Academy

    Carpe Diem
    It's been a while since I could write an article for this blog but alas, I was (and still am) very busy with customer work. Which is actually good. So, what is this article going to tell you? Well, in general, just what I already tweeted: that life is a constant process of learning - especially as a software craftsman. Due to an upcoming new customer project in ASP.NET I had to seize the opportunity to get my head deeper into the latest available technologies, like Windows Azure and SQL Azure. I know... cloud computing and so on is not a recent development and has been available for quite a while, but I never had the means to get myself into it until roughly two weeks ago.
    Microsoft Virtual Academy
    I can't remember exactly what guided me towards the Microsoft Virtual Academy (MVA), oh wait... Yes, it was a posting on Facebook from an old CLIP community friend. He posted a shortened URL with an #MVA tag that caught my attention. Thanks for that, Thomas Kuberek. After the usual sign in or registration via Live ID I was a little bit surprised that Mauritius is not an available country option... A quick mail exchange with the MVA Dean, and yeah, apologies for the missing entry. So, currently I'm learning about Microsoft products and services, and collecting points under "Not Listed Country" until Mauritius is going to be added. Hopefully soon, as MVA honors your effort with different knowledge ranks that are compared to other students with public profiles. I think it's a nice move to add some game and competition factor into the learning. The tracks and their different modules are mainly references to publicly available material online, namely on either MSDN, TechNet, Channel9, or other Microsoft based sites. The course material therefore also varies in media and formats, ranging from simple online articles to downloadable documents (.docx or .pdf) and Silverlight / Windows Media streams with download options.
    Self-assessment and student ranking
    Each module in a track can be finished by taking part in a self-assessment. Up to now, the assessments I did (and passed) were limited to 10 minutes of available time, and consisted of six to seven questions on the module training material. Nothing too serious, but it gives you a glimpse of how Microsoft certification exams are structured.
    Conclusion
    Nothing really new, but nicely gathered, assembled and presented to the MVA students. At the moment, I wouldn't dare to compare the richness and quality of those courses with professional training offers, like Pluralsight .NET Training, LearnDevNow, VTC, etc. at all, but I think that MVA has potential. Give it a try, and let me know about your opinions.

    Read the article

  • What do you think of the EntLib 5.0 configuration tool?

    Hello again! It's been a while, I know. I've been busy over the last few months with several projects, some of them software related, and one of them human - my son Jesse was born on 26 February 2010. Fun times! Meanwhile, back in Redmond, the p&p team has been busy working on Enterprise Library 5.0 - see Grigori's announcement for details on the beta. There's a ton of new stuff in this release, but there's one big new feature that hasn't received a lot of attention that I'm keen to hear your perspectives on. The change is the biggest overhaul to the configuration tool since Enterprise Library was launched. If you haven't yet grabbed the EntLib 5.0 beta, here's a before and after shot of the config tool:
    Enterprise Library 4.1 config tool
    Enterprise Library 5.0 (beta 1) config tool
    The tool has been rebuilt from the ground up in response to some feedback and usability studies from the previous version of the tool. But is this a step in the right direction? I'd love to hear what you think. If you've downloaded EntLib 5.0 and tried out the tool, please share your thoughts on:
    First impressions. Is the tool easy to understand? Easy to find what you're looking for? Easy to read existing configuration? Pretty?
    Ease of use for real life tasks. Rather than make up your own tasks, here are a few sample scenarios you might want to try:
    - Configure the data access block with a SQL Server connection called Audit that points to a database called Audit on a server called DB
    - Configure the logging block so that any log entries in the Audit category are written to both the Event Log and the Audit database (see above)
    - Configure the validation block with a ruleset called Email Address that uses an appropriate regular expression for e-mail addresses
    - Configure the policy injection block such that any calls to classes in the MyCompany.Security namespace are logged before and after the call using the Audit category (see above)
    Comparison with the old config tool. What do you like better in the new tool? What did you like better in the old tool? How do you rate your level of expertise using the old tool?
    Keep in mind that I no longer work in the p&p team, so I can't say how any of this feedback will be used (although I'm sure the team is listening!). However, since I've invested so much time in Enterprise Library, both in leading the team and using the product on real projects, I'm very interested to hear what you all think of the tool's new direction.

    Read the article

  • Backup Azure Tables with the Enzo Backup API

    - by Herve Roggero
    In case you missed it, you can now back up (and restore) Azure Tables and SQL Databases directly through an API. The features available through the API can be found here: http://www.bluesyntax.net/backup20api.aspx and the online help for the API is here: http://www.bluesyntax.net/EnzoCloudBackup20/APIIntro.aspx. Backing up Azure Tables can’t be any easier than with the Enzo Backup API. Here is sample code that does the trick:

    // Create the backup helper class. The constructor automatically sets the SourceStorageAccount property
    StorageBackupHelper backup = new StorageBackupHelper("storageaccountname", "storageaccountkey", "sourceStorageaccountname", "sourceStorageaccountkey", true, "apilicensekey");

    // Now set some properties…
    backup.UseCloudAgent = false;                          // backup locally
    backup.DeviceURI = @"c:\TMP\azuretablebackup.bkp";     // to this file
    backup.Override = true;
    backup.Location = DeviceLocation.LocalFile;

    // Set optional performance options
    backup.PKTableStrategy.Mode = BSC.Backup.API.TableStrategyMode.GUID; // Use the GUID strategy instead of the default scan
    backup.MaxRESTPerSec = 200; // Attempt to stay below 200 REST calls per second

    // Start the backup now…
    string taskId = backup.Backup();

    // Use the Environment class to get the final status of the operation
    EnvironmentHelper env = new EnvironmentHelper("storageaccountname", "storageaccountkey", "apilicensekey");
    string status = env.GetOperationStatus(taskId);

    As you can see above, the code is straightforward. You provide connection settings in the constructor, set a few options indicating where the backup device will be located, set optional performance parameters, and start the backup. The performance options are designed to help you back up your Azure Tables quickly while staying under a specific threshold to prevent Storage Account throttling. For example, the MaxRESTPerSec property will attempt to keep the overall backup operation under 200 REST calls per second. Another performance option is the Backup Strategy for Azure Tables. By default, all tables are simply scanned. While this works best for smaller Azure Tables, larger tables can use the GUID strategy, which will issue requests against an Azure Table in parallel, assuming the PartitionKey stores GUID values. This doesn't mean your PartitionKey must contain GUIDs for the strategy to work; it just means the backup algorithm is tuned for that condition. Other options are available as well, such as filtering which columns, entities or tables are being backed up. Check out more on the Blue Syntax website at http://www.bluesyntax.net.
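    If you prefer to wait for the backup to complete rather than checking the status once, a simple polling loop around GetOperationStatus does the job. Here is a rough sketch; note that the terminal status strings below are placeholders assumed for illustration, so check the Enzo Backup API help for the actual values:

    // Poll the operation status until it reaches a terminal state.
    // "Completed" and "Failed" are assumed placeholder values, not documented status strings.
    string currentStatus = env.GetOperationStatus(taskId);
    while (currentStatus != "Completed" && currentStatus != "Failed")
    {
        System.Threading.Thread.Sleep(5000); // wait five seconds between checks to avoid hammering the service
        currentStatus = env.GetOperationStatus(taskId);
    }
    Console.WriteLine("Backup finished with status: " + currentStatus);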

    Read the article

  • Maximum Availability with Oracle GoldenGate

    - by Irem Radzik
    Oracle Database offers a variety of built-in and optional products for maximum availability, and it is well known for its robust high availability and disaster recovery solutions. With its heterogeneous, real-time, transactional data movement capabilities, Oracle GoldenGate is a key part of the Maximum Availability Architecture for Oracle Database. This week, on Thursday Dec. 13th, we will present in a live webcast how Oracle GoldenGate fits into the Oracle Database Maximum Availability Architecture (MAA). Joe Meeks from the Oracle Database High Availability team will discuss how Oracle GoldenGate complements other key products within MAA such as Active Data Guard. Nick Wagner from the GoldenGate PM team will present how to upgrade to the latest Oracle Database release without any downtime. Nick will also cover two new features of Oracle GoldenGate 11gR2: Integrated Capture for Oracle Database and Automated Conflict Detection and Resolution. Nick will provide an in-depth review of these new features with examples. Oracle GoldenGate also offers maximum availability for non-Oracle databases, such as HP NonStop, SQL Server, DB2 (LUW, iSeries, or zSeries) and more. The same robust, reliable, real-time, bidirectional data movement capabilities apply to all supported databases. I'd like to invite you to join us on Thursday Dec. 13th at 10am PT/1pm ET to hear from the product experts on how to use GoldenGate for maximizing database availability and to ask your questions. You can find the registration link below. Webcast: Maximum Availability with Oracle GoldenGate, Thursday Dec. 13th, 10am PT/1pm ET. We look forward to another great webcast with lots of interaction with the audience.

    Read the article

  • Launch 2010 Technical Readiness Unofficial Q&A.

    - by mbcrump
    I had an email from one of my readers about the 2010 Technical Readiness Series. Please read below:

    Hi Michael, I noticed you blogged a while back that you were going to attend the MS 2010 Launch event. I'm going to the session in Seattle on May 27; is it worth it to attend? Also, I'm wondering if they give any free software away, like VS2010 Pro? Any decent vendors? Looking forward to hearing back from you.

    I decided this information would probably benefit several readers instead of just responding to the one who asked. In case you are not aware, MS has a 2010 launch event showing VS2010, Office 2010, SharePoint 2010 and SQL Server 2008 R2. You can sign up for an event here: https://microsoft.crgevents.com/Register2010/Content/Event_Selection.aspx

    I've answered the questions asked below.

    Q: Is it worth going?
    A: It is if you have had little or no exposure to the latest 2010 products. Most people are familiar with VS2010, but have not seen SharePoint 2010, Office 2010, or Windows Phone 7. It is designed to get you up to speed very quickly. If you have watched most of the MIX videos and keep up with .NET in general, you will benefit more from having the ability to ask questions.

    Q: Did you get any free software?
    A: No, only demos, including: VS2010/Office 2010 – they give you a link to the URL where you can download the trial version. Windows 7 Enterprise – you get a DVD with the trial version loaded.

    Q: Do they give away any cool swag?
    A: Just a Microsoft T-shirt (XL).

    Q: What about the vendor selection?
    A: At the event that I went to, most vendors were pushing SharePoint products. There wasn't a lot of variety in the selection. Most vendors were giving away the typical pens, buttons, stickers, and trial software.

    If you have any other questions, feel free to contact me. I will answer and add to this unofficial FAQ.

    Read the article

< Previous Page | 1014 1015 1016 1017 1018 1019 1020 1021 1022 1023 1024 1025  | Next Page >