Search Results

Search found 33316 results on 1333 pages for 'sql team'.


  • apostrophe in mysql/php

    - by fusion
    I'm trying to learn PHP/MySQL. Inserting data into MySQL works fine, but inserting values that contain an apostrophe generates an error. I tried using mysql_real_escape_string, yet this doesn't work. Would appreciate any help. <?php include 'config.php'; echo "Connected <br />"; $auth = $_POST['author']; $quo = $_POST['quote']; $author = mysql_real_escape_string($auth); $quote = mysql_real_escape_string($quo); //************************** //inserting data $sql="INSERT INTO Quotes (vauthor, cquotes) VALUES ($author, $quote)"; if (!mysql_query($sql,$conn)) { die('Error: ' . mysql_error()); } echo "1 record added"; ... What am I doing wrong?
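    A likely explanation, offered as a sketch rather than a confirmed diagnosis: mysql_real_escape_string() escapes the apostrophes inside the values, but the INSERT above never wraps the values in quotes, so the statement is malformed regardless of the escaping. Quoting each interpolated value (table and column names taken from the question) would look like this:

        <?php
        include 'config.php';

        // Escape the raw POST values first.
        $author = mysql_real_escape_string($_POST['author']);
        $quote  = mysql_real_escape_string($_POST['quote']);

        // Note the single quotes around each value -- escaping alone does not
        // produce valid SQL; the string literals still need to be quoted.
        $sql = "INSERT INTO Quotes (vauthor, cquotes) VALUES ('$author', '$quote')";

        if (!mysql_query($sql, $conn)) {
            die('Error: ' . mysql_error());
        }
        echo "1 record added";
        ?>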

    Read the article

  • Django + FastCGI - randomly raising OperationalError

    - by ibz
    I'm running a Django application. Had it under Apache + mod_python before, and it was all OK. Switched to Lighttpd + FastCGI. Now I randomly get the following exception (neither the place nor the time where it appears seems to be predictable). Since it's random, and it appears only after switching to FastCGI, I assume it has something to do with some settings. Found a few results when googling, but they seem to be related to setting maxrequests=1. However, I use the default, which is 0. Any ideas where to look? PS. I'm using PostgreSQL. Might be related to that as well, since the exception appears when making a database query. Thanks. File "/usr/lib/python2.6/site-packages/django/core/handlers/base.py", line 86, in get_response response = callback(request, *callback_args, **callback_kwargs) File "/usr/lib/python2.6/site-packages/django/contrib/admin/sites.py", line 140, in root if not self.has_permission(request): File "/usr/lib/python2.6/site-packages/django/contrib/admin/sites.py", line 99, in has_permission return request.user.is_authenticated() and request.user.is_staff File "/usr/lib/python2.6/site-packages/django/contrib/auth/middleware.py", line 5, in __get__ request._cached_user = get_user(request) File "/usr/lib/python2.6/site-packages/django/contrib/auth/__init__.py", line 83, in get_user user_id = request.session[SESSION_KEY] File "/usr/lib/python2.6/site-packages/django/contrib/sessions/backends/base.py", line 46, in __getitem__ return self._session[key] File "/usr/lib/python2.6/site-packages/django/contrib/sessions/backends/base.py", line 172, in _get_session self._session_cache = self.load() File "/usr/lib/python2.6/site-packages/django/contrib/sessions/backends/db.py", line 16, in load expire_date__gt=datetime.datetime.now() File "/usr/lib/python2.6/site-packages/django/db/models/manager.py", line 93, in get return self.get_query_set().get(*args, **kwargs) File "/usr/lib/python2.6/site-packages/django/db/models/query.py", line 304, in get num = len(clone) File "/usr/lib/python2.6/site-packages/django/db/models/query.py", line 160, in __len__ self._result_cache = list(self.iterator()) File "/usr/lib/python2.6/site-packages/django/db/models/query.py", line 275, in iterator for row in self.query.results_iter(): File "/usr/lib/python2.6/site-packages/django/db/models/sql/query.py", line 206, in results_iter for rows in self.execute_sql(MULTI): File "/usr/lib/python2.6/site-packages/django/db/models/sql/query.py", line 1734, in execute_sql cursor.execute(sql, params) OperationalError: server closed the connection unexpectedly This probably means the server terminated abnormally before or while processing the request.
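    One commonly suggested workaround, offered as a hedged sketch rather than a confirmed fix: if the FastCGI server preforks worker processes after Django has already opened a PostgreSQL connection, the workers can end up sharing a single database socket, and queries then fail unpredictably with exactly this error. Forcing the connection closed at the start of each request (or before the fork) makes each worker open its own:

        # sketch of an old-style Django middleware (add to MIDDLEWARE_CLASSES):
        # forces a fresh database connection per request.
        from django.db import connection

        class FreshConnectionMiddleware(object):
            def process_request(self, request):
                connection.close()  # Django reopens the connection lazily on next use
                return None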

    Read the article

  • Oracle Linked Server error: ORA-12640: Authentication adapter initialization failed

    - by Chenster
    I have a linked server on SQL Server that talks to Oracle. Executing the following SQL statement using OPENQUERY: SELECT * FROM OPENQUERY(finance, 'select * from KFRI.VW_XREF_PROJECTS') returns the following error: OLE DB provider "OraOLEDB.Oracle" for linked server "finance" returned message "ORA-12640: Authentication adapter initialization failed". Msg 7303, Level 16, State 1, Line 1 Cannot initialize the data source object of OLE DB provider "OraOLEDB.Oracle" for linked server "finance". I tried to set SQLNET.AUTHENTICATION_SERVICES = (NONE) in {$ORACLE_HOME}\NETWORK\ADMIN\sqlnet.ora. It did not help. What's interesting is that my coworker is able to execute exactly the same query successfully on his machine without a hitch. Any tips on how to fix this are greatly appreciated!
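    A hedged suggestion, not a confirmed fix: ORA-12640 usually points at the authentication adapter list in the client-side sqlnet.ora, and since a coworker's machine works, diffing the two files is a reasonable first step. The commonly suggested value on Windows is NTS rather than NONE:

        # %ORACLE_HOME%\network\admin\sqlnet.ora -- example value only
        SQLNET.AUTHENTICATION_SERVICES = (NTS)

    If the SQL Server service account differs between the two machines, that is also worth comparing, since the adapter initializes in the context of the service, not the interactively logged-in user.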

    Read the article

  • IIS Strategies for Accessing Secured Network Resources

    - by ErikE
    Problem: A user connects to a service on a machine, such as an IIS web site or a SQL Server database. The site or the database needs to gain access to network resources such as file shares (the most common) or a database on a different server. Permission is denied. This is because the user the service is running under doesn't have network permissions in the first place, or, if it does, it doesn't have rights to access the remote resource. I keep running into this problem over and over again and am tired of not having a really solid way of handling it. Here are some workarounds I'm aware of:
    - Run IIS as a custom-created domain user who is granted high permissions. If permissions are granted one file share at a time, then every time I want to read from a new share I have to ask a network admin to add it for me. Eventually, with many web sites reading from many shares, it is going to get really complicated. If permissions are just opened up wide for the user to access any file share in our domain, then this seems like an unnecessary security surface area to present. This also applies to all the sites running on IIS, rather than just the selected site or virtual directory that needs the access, a further surface-area problem.
    - Still use the IUSR account, but give it network permissions and set up the same user name on the remote resource (not a domain user, a local user). This also has its problems. For example, there's a file share I am using that I have full rights to for sharing, but I can't log in to the machine. So I have to find the right admin and ask him to do it for me. Any time something has to change, it's another request to an admin.
    - Allow IIS users to connect as anonymous, but set the account used for anonymous access to a high-privilege one. This is even worse than giving the IIS IUSR full privileges, because it means my web site can't use any kind of security in the first place.
    - Connect using Kerberos, then delegate. This sounds good in principle but has all sorts of problems. First of all, if you're using virtual web sites where the domain name you connect to the site with is not the base machine name (as we do frequently), then you have to set up a Service Principal Name on the web server using Microsoft's SetSPN utility. It's complicated and apparently prone to errors. Also, you have to ask your network/domain admin to change security policy for both the web server and the domain account so they are "trusted for delegation." If you don't get everything perfectly right, suddenly your intended Kerberos authentication is NTLM instead, and you can only impersonate rather than delegate, and thus you cannot reach out over the network as the user. Also, this method can be problematic because sometimes you need the web site or database to have permissions that the connecting user doesn't have.
    - Create a service or COM+ application that fetches the resource for the web site. Services and COM+ packages run with their own set of credentials. Running as a high-privilege user is okay, since they can do their own security and deny requests that are not legitimate, putting control in the hands of the application developer instead of the network admin. Problems: I am using a COM+ package that does exactly this on Windows Server 2000 to deliver highly sensitive images to a secured web application. I tried moving the web site to Windows Server 2003 and was suddenly denied permission to instantiate the COM+ object, very likely due to registry permissions. I trawled around quite a bit and did not solve the problem, partly because I was reluctant to give the IUSR account full registry permissions. That seems like the same bad practice as just running IIS as a high-privilege user. Note: this is actually really simple. In a programming language of your choice, you create a class with a function that returns an instance of the object you want (an ADODB.Connection, for example), and build a dll, which you register as a COM+ object. In your web server-side code, you create an instance of the class and use the function, and since it is running under a different security context, calls to network resources work. (A sketch of this approach follows the question.)
    - Map drive letters to shares. This could theoretically work, but in my mind it's not really a good long-term strategy. Even though mappings can be created with specific credentials, and this can be done by people other than a network admin, it also means that there are either way too many shared drives (small granularity) or too much permission granted to entire file servers (large granularity). Also, I haven't figured out how to map a drive so that the IUSR gets the drives. Mapping a drive is for the current user; I don't know the IUSR account password to log in as it and create the mappings.
    - Move the resources local to the web server/database. There are times when I've done this, especially with Access databases. Does the database have to live out on the file share? Sometimes it was just easiest to move the database to the web server or to the SQL database server (so the linked server to it would work). But I don't think this is a great all-around solution, either. And it won't work when the resource is a service rather than a file.
    - Move the service to the final web server/database. I suppose I could run a web server on my SQL Server database, so the web site can connect to it using impersonation and make me happy. But do we really want random extra web servers on our database servers just so this is possible? No.
    - Virtual directories in IIS. I know that virtual directories can help make remote resources look as though they are local, and this supports using custom credentials for each virtual directory. I haven't yet been able to work out how this would solve the problem for system calls. Users could reach file shares directly, but this won't help, say, classic ASP code access resources. I could use a URL instead of a file path to read remote data files in a web page, but this isn't going to help me make a connection to an Access database, a SQL Server database, or any other resource that uses a connection library rather than being able to just read all the bytes and work with them.
    I wish there was some kind of "service tunnel" that I could create. Think about how a VPN makes remote resources look like they are local. With a richer aliasing mechanism, perhaps code-based, why couldn't even database connections occur under a defined security context? Why not a special Windows component that lets you specify, per user, what resources are available and what alternate credentials are used for the connection? File shares, databases, web sites, you name it. I guess I'm almost talking about a specialized local proxy server. Anyway, so there's my list. I may update it if I think of more. Does anyone have any ideas for me? My current problem today is, yet again, I need a web site to connect to an Access database on a file share. Here we go again...
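    To make the COM+ option above concrete, here is a minimal C# sketch of a serviced component (class, method and share names are invented for illustration; a real deployment also needs a strong-name key and registration with regsvcs.exe). The component runs as a COM+ server application under its own configured identity, so file access happens in that security context rather than the web site's:

        using System.EnterpriseServices;
        using System.IO;

        [assembly: ApplicationName("ResourceBroker")]
        // Server activation: the component runs in its own process (dllhost.exe)
        // under the identity configured for the COM+ application, not the IIS user.
        [assembly: ApplicationActivation(ActivationOption.Server)]

        namespace ResourceBroker
        {
            public class ShareReader : ServicedComponent
            {
                // Reads a file from a share that the COM+ identity can reach.
                // The path check keeps callers from reading arbitrary files,
                // putting the security decision in the component itself.
                public byte[] ReadImage(string uncPath)
                {
                    if (!uncPath.StartsWith(@"\\fileserver\images\"))
                        throw new IOException("Path not permitted.");
                    return File.ReadAllBytes(uncPath);
                }
            }
        }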

    Read the article

  • Load and Web Performance Testing using Visual Studio Ultimate 2010-Part 2

    - by Tarun Arora
    Welcome back. In part 1 of Load and Web Performance Testing using Visual Studio 2010, I talked about why performance testing your application is important, the test tools available in Visual Studio Ultimate 2010 and various test rig topologies. In this blog post I’ll get into the details of web performance and load tests, as well as why it’s important to follow a goal-based pattern while performance testing your application. Tools => Options => Test Tools Have you visited the treasures of the Visual Studio menu bar, Tools => Options => Test Tools, lately? The options to enable or disable prompts on creating, editing, deleting or running manual/automated tests can be controlled from here. The default test project language and the default test types created on new test project creation can be selected/unselected from here. Ever wondered how to change the default limit of 25 test results? This, again, can be changed from here. If you record a lot of web tests and wish for the web test recorder to start with “that” URL populated, well, this again can be specified from here. If you haven’t so far, I would urge you to spend 2 minutes in the test tools options. Test Menu => Ready Steady Test Action! The test tools are under the Test menu in Visual Studio; apart from being able to create a new Test and Test List, you can also load an existing vsmdi file. You can also manage your test controllers from here. A solution can have one or more test settings files, but there can only be one active test settings file at any time. Again, this selection can be done from here. You can open the various test windows from under the Windows option in the Test menu. If you open the Test View window you will see that you have the option to group the tests by work items, project, test type, etc. You can set these properties by right-clicking a test in the test list and choosing Properties from the context menu. So, what is a vsmdi file? vsmdi stands for Visual Studio Test Metadata File. Placed under the Solution Items, this file keeps track of the list of unit tests in your solution. If you open the vsmdi file as an XML file you will see a series of TestLink elements nested within the TestList tags, along with the RunConfiguration tag. When you run tests in Visual Studio, the IDE looks at the vsmdi file to see what tests need to be run. You also have the option of using the vsmdi file in your team builds to specify which tests need to run as part of the build. Refer here for a walkthrough from a fellow blogger on how to use the vsmdi file in team builds. Web Performance Test – The Truth! In Visual Studio 2010, “Web Tests” have been renamed to “Web Performance Tests”. Apart from the renaming, there have been several improvements to this test type in Visual Studio 2010. I am very active on the MSDN Visual Studio And Load Testing forum, and a frequent question from many users is “Do web tests support pages that run JavaScript?” I will start with a little bit of background before answering this question. Web performance tests operate at the HTTP layer, but why? To enable you to generate high loads with a relatively low amount of hardware, web performance tests are driven at the protocol layer rather than instantiating a browser. The most common source of confusion is that users do not realize web performance tests work at the HTTP layer. The tool adds to that misconception.
    After all, you record in IE, and when running a web test you can select which browser to use, and then the result viewer shows the results in a browser window. So that means the tests run through the browser, right? NO! The web test engine works at the HTTP layer and does not instantiate a browser. What does that mean? In the diagram below, you can see there are no browsers running when the engine is sending and receiving requests. Does that mean I can’t test pages that use JavaScript? The best example of JavaScript generating HTTP traffic is AJAX calls. The most common examples of browser plug-ins are Silverlight and Flash. The web test recorder will record HTTP traffic from AJAX calls and from most (but not all) browser plug-ins. This means you will still be able to performance test pages that use JavaScript or a plug-in and play back the results, but the playback engine will not show the JavaScript or plug-in results in the ‘browser control’. If you want to test the page behaviour that results from the JavaScript or plug-in, consider using Coded UI Tests. This page looks like it failed, when in fact it succeeded! Looking closely at the response, and subsequent requests, it is clear the operation succeeded. As stated above, the reason the browser control shows this message is that JavaScript has been disabled in this control. So, to reiterate, the web performance test recorder: - sends and receives data at the HTTP layer; - does NOT run a browser; - does NOT run JavaScript; - does NOT host ActiveX controls or plug-ins. There is a great series of blog posts from Ed Glas; I would highly recommend his blog to anyone performing load/performance testing through Visual Studio. Demo – Web Performance Test [Demo] - Visual Studio Ultimate 2010: Test Settings and Configuration   [Demo]–Visual Studio Ultimate 2010: Web Performance Test   In this short video I try to answer the following questions: Why is performance testing important? How does Visual Studio help you performance test your applications? How do I record a web performance test? How do I make a web performance test data driven, transaction driven, loop driven, convert it to code, and add validations? Best practices for recording web performance tests. I have a web performance test, what next? Creating the web performance test was the first step towards load testing your application. Now that we have the base test, we can test the page behaviour when N users access the page. Have you ever had the head of business call you and mention that the marketing team has done a fantastic job and is expecting increased traffic on the web site, and ask whether the website can survive the weekend with that additional load? This is the perfect opportunity to capacity test your application to see how your website holds up under various levels of load; you can work the results backwards to see how much hardware you may need to scale up your application to survive the weekend. Apart from that, it is always a good idea to have some benchmarks around how the application performs under light load for a short duration and under heavy load for a long duration, and to soak test the application (run a constant load for a week or two) to record the effects of constant load over really long durations; this is a great way of identifying how your application handles the default IIS application pool recycle, which by default is configured to occur every 29 hours. These benchmarks will act as the perfect yardstick to measure performance gains when you start making improvements.
    BUT there are some best practices! => Goal Based Load Testing Approach Since the subject is vast and there are a lot of things to measure and analyse, it is very easy to get distracted from the real goal! You can optimize your application once you know where the pain points are. There is no point performing a load test of 5,000 users if your intranet application will only have 100 simultaneous users; it is important to keep focussed on the real goals of the project. The idea, then, is to have a user story around your load testing scenarios and to test realistically. It is recommended that you follow the outline below. It is an iterative process: refine your objectives, identify the key scenarios, establish the expected workload and the key metrics you want to report, record the web performance tests, simulate load and analyse the results. Is your application already deployed in production? This is great! You can analyse the IIS logs to understand user behaviour… But what are IIS logs? The IIS logs allow you to record events for each application and web site on the web server. You can create separate logs for each of your applications and web sites. Logging information in IIS goes beyond the scope of the event logging or performance monitoring features provided by Windows. The IIS logs can include information such as who has visited your site, what the visitor viewed, and when the information was last viewed. You can use the IIS logs to identify any attempts to gain unauthorized access to your web server. How do you configure IIS logs? Ninjas who already have IIS logs configured (by the way, it’s on by default) and need a way to analyse them can use the Windows IIS utility Log Parser. Log Parser is a very powerful tool that provides a generic SQL-like language on top of many types of data, like IIS logs, Event Viewer entries, XML files, CSV files, the file system and others; it allows you to export the result of the queries to many output formats such as CSV, XML, SQL Server, charts and others; and it works well with IIS 5, 6, 7 and 7.5. Frequently used Log Parser queries (a sample query follows this post). Demo – Load Test [Demo]–Visual Studio Ultimate 2010: Load Testing   In this short video I try to answer the following questions: What are the types of performance testing? How do you perform goal-driven load testing, analyse a test run result and generate a report? Recap A quick recap of what we have covered so far. Thank you for taking the time out to read this blog post; in part III of this blog series I’ll be getting into the details of Test Result Analysis, Test Result Drill through, Test Report Generation, Test Run Comparison, and the ASP.NET Profiler. If you enjoyed the post, remember to subscribe to http://feeds.feedburner.com/TarunArora. Questions/feedback/suggestions, etc.: please leave a comment. See you in Part III
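    As a concrete illustration of the Log Parser approach mentioned above, here is one sample query (a sketch; the log file mask and available fields depend on your IIS logging configuration) that lists the ten most requested URLs from W3C-format IIS logs:

        REM Top 10 most-hit URLs across all W3C logs in the current directory
        LogParser.exe -i:IISW3C "SELECT TOP 10 cs-uri-stem, COUNT(*) AS Hits FROM ex*.log GROUP BY cs-uri-stem ORDER BY Hits DESC"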

    Read the article

  • Report Viewer problem after moving to new server

    - by SystemicPlural
    I have just moved a site from a Windows 2003, IIS6, SQL 2005 server to a new one with Windows 2008, IIS7 and SQL 2008. I am having problems with the Report Viewer. I have installed the Report Viewer redistributable (I've tried 2005, 2005 SP, 2008 and 2008 SP). I've mapped a handler in IIS for Reserved.ReportViewerWebControl.axd to type Microsoft.Reporting.WebForms.HttpHandler, Microsoft.ReportViewer.WebForms, Version=8.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a. However, whenever I run a report on the website I get the following error message: Could not load type 'Microsoft.Reporting.Microsoft.Reporting.WebForms.HttpHandler' from assembly 'Microsoft.ReportViewer.WebForms, Version=9.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a'. Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code. Exception Details: System.TypeLoadException: Could not load type 'Microsoft.Reporting.Microsoft.Reporting.WebForms.HttpHandler' from assembly 'Microsoft.ReportViewer.WebForms, Version=9.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a'. I am stumped. Any ideas?
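    Two details in the error text above suggest a direction, offered as a hedged reading rather than a certain diagnosis: the type name in the exception has "Microsoft.Reporting." doubled, and the handler was registered against Version=8.0.0.0 while the exception reports 9.0.0.0. A web.config handler mapping that matches the installed 2008 (9.0) assembly and uses the undoubled type name would look like this:

        <!-- system.webServer/handlers entry for the IIS7 integrated pipeline;
             Version=9.0.0.0 corresponds to the Report Viewer 2008 redistributable -->
        <add name="ReportViewerWebControlHandler"
             verb="*"
             path="Reserved.ReportViewerWebControl.axd"
             preCondition="integratedMode"
             type="Microsoft.Reporting.WebForms.HttpHandler, Microsoft.ReportViewer.WebForms, Version=9.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" />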

    Read the article

  • The fastest way to resize images from ASP.NET. And it’s (more) supported-ish.

    - by Bertrand Le Roy
    I’ve shown before how to resize images using GDI, which is fairly common but is explicitly unsupported because we know of very real problems that this can cause. Still, many sites use that method because those problems are fairly rare, and because most people assume it’s the only way to get the job done. Plus, it works in medium trust. More recently, I’ve shown how you can use WPF APIs to do the same thing and get JPEG thumbnails, only 2.5 times faster than GDI (even now that GDI really ultimately uses WIC to read and write images). The boost in performance is great, but it comes at a cost that you may or may not care about: it won’t work in medium trust. It’s also just as unsupported as the GDI option. What I want to show today is how to use the Windows Imaging Components from ASP.NET APIs directly, without going through WPF. The approach has the great advantage that it’s been tested and proven to scale very well. The WIC team tells me you should be able to call support and get answers if you hit problems. Caveats exist though. First, this is using interop, so until a signed wrapper sits in the GAC, it will require full trust. Second, the APIs have a very strong smell of native code and are definitely not .NET-friendly. And finally, the most serious problem is that older versions of Windows don’t offer MTA support for image decoding. MTA support is only available on Windows 7, Vista and Windows Server 2008. But on 2003 and XP, you’ll only get STA support. That means the thread safety that we so badly need for server applications is not guaranteed on those operating systems. To make it work, you’d have to spin up specialized threads yourself and manage the lifetime of your objects, which is outside the scope of this article. We’ll assume that we’re fine with all this and that we’re running on 7 or 2008 under full trust. Be warned that the code that follows is not simple or very readable. This is definitely not the easiest way to resize an image in .NET. Wrapping native APIs such as WIC in a managed wrapper is never easy, but fortunately we won’t have to: the WIC team already did it for us and released the results under MS-PL. The InteropServices folder, which contains the wrappers we need, is in the WicCop project, but I’ve also included it in the sample that you can download from the link at the end of the article. In order to produce a thumbnail, we first have to obtain a decoding frame object that WIC can use. Like with WPF, that object will contain the command to decode a frame from the source image but won’t do the actual decoding until necessary. Getting the frame is done by reading the image bytes through a special WIC stream that you can obtain from a factory object that we’re going to reuse for lots of other tasks: var photo = File.ReadAllBytes(photoPath); var factory = (IWICComponentFactory)new WICImagingFactory(); var inputStream = factory.CreateStream(); inputStream.InitializeFromMemory(photo, (uint)photo.Length); var decoder = factory.CreateDecoderFromStream( inputStream, null, WICDecodeOptions.WICDecodeMetadataCacheOnLoad); var frame = decoder.GetFrame(0); We can read the dimensions of the frame using the following (somewhat ugly) code: uint width, height; frame.GetSize(out width, out height); This enables us to compute the dimensions of the thumbnail, as I’ve shown in previous articles. We now need to prepare the output stream for the thumbnail.
    WIC requires a special kind of stream, IStream (not implemented by System.IO.Stream), and doesn’t directly understand .NET streams. It does provide a number of implementations, but not exactly what we need here. We need to output to memory because we’ll want to persist the same bytes to the response stream and to a local file for caching. The memory-bound version of IStream requires a fixed-length buffer, but we won’t know the length of the buffer before we resize. To solve that problem, I’ve built a class derived from MemoryStream that also implements IStream. The implementation is not very complicated; it just delegates the IStream methods to the base class, but it involves some native pointer manipulation. Once we have a stream, we need to build the encoder for the output format, which could be anything that WIC supports. For web thumbnails, our only reasonable options are PNG and JPEG. I explored PNG because it’s a lossless format, and because WIC does support PNG compression. That compression is not very efficient though, and JPEG offers good quality with much smaller file sizes. On the web, it matters. I found the best PNG compression option (adaptive) to give files that are about twice as big as 100%-quality JPEG (an absurd setting), 4.5 times bigger than 95%-quality JPEG and 7 times larger than 85%-quality JPEG, which is more than acceptable quality. As a consequence, we’ll use JPEG. The JPEG encoder can be prepared as follows: var encoder = factory.CreateEncoder( Consts.GUID_ContainerFormatJpeg, null); encoder.Initialize(outputStream, WICBitmapEncoderCacheOption.WICBitmapEncoderNoCache); The next operation is to create the output frame: IWICBitmapFrameEncode outputFrame; var arg = new IPropertyBag2[1]; encoder.CreateNewFrame(out outputFrame, arg); Notice that we are passing in a property bag. This is where we’re going to specify our only parameter for encoding, the JPEG quality setting: var propBag = arg[0]; var propertyBagOption = new PROPBAG2[1]; propertyBagOption[0].pstrName = "ImageQuality"; propBag.Write(1, propertyBagOption, new object[] { 0.85F }); outputFrame.Initialize(propBag); We can then set the resolution for the thumbnail to be 96, something we weren’t able to do with WPF and had to hack around: outputFrame.SetResolution(96, 96); Next, we set the size of the output frame and create a scaler from the input frame and the computed dimensions of the target thumbnail: outputFrame.SetSize(thumbWidth, thumbHeight); var scaler = factory.CreateBitmapScaler(); scaler.Initialize(frame, thumbWidth, thumbHeight, WICBitmapInterpolationMode.WICBitmapInterpolationModeFant); The scaler is using the Fant method, which I think is the best-looking one even if it seems a little softer than cubic (the original post shows zoomed comparisons of the Cubic, Fant, Linear and Nearest neighbor modes). We can write the source image to the output frame through the scaler: outputFrame.WriteSource(scaler, new WICRect { X = 0, Y = 0, Width = (int)thumbWidth, Height = (int)thumbHeight }); And finally we commit the pipeline that we built and get the byte array for the thumbnail out of our memory stream: outputFrame.Commit(); encoder.Commit(); var outputArray = outputStream.ToArray(); outputStream.Close(); That byte array can then be sent to the output stream and to the cache file. Once we’ve gone through this exercise, it’s only natural to wonder whether it was worth the trouble.
    I ran this method, as well as GDI and WPF resizing, over thirty 12-megapixel images for JPEG qualities between 70% and 100%, and measured the file size and time to resize. (The original post charts the size of the resized images and the time to resize the thirty images.) Not much to see on the size graph: sizes from WPF and WIC are equivalent, which is hardly surprising as WPF calls into WIC. There is just an anomaly at 75% for WPF that I noted in my previous article and that disappears when using WIC directly. But overall, using WPF or WIC over GDI represents a slight win in file size. The time to resize is more interesting. WPF and WIC get similar times, although WIC seems to always be a little faster. Not surprising, considering WPF is using WIC. The margin of error on these results is probably fairly close to the time difference. As we already knew, the time to resize does not depend on the quality level; only the size does. This means that the only decision you have to make here is size versus visual quality. This third approach to server-side image resizing in ASP.NET seems to converge on the fastest possible one. We have marginally better performance than WPF, but with some additional peace of mind that this approach is sanctioned for server-side usage by the Windows Imaging team. It still doesn’t work in medium trust. That is a problem, and it shows the way for future server-friendly managed wrappers around WIC. The sample code for this article can be downloaded from: http://weblogs.asp.net/blogs/bleroy/Samples/WicResize.zip The benchmark code can be found here (you’ll need to add your own images to the Images directory and then add those to the project, with content and copy-if-newer set in the properties of the files in the solution explorer): http://weblogs.asp.net/blogs/bleroy/Samples/WicWpfGdiImageResizeBenchmark.zip WIC tools can be downloaded from: http://code.msdn.microsoft.com/wictools The original post concludes with some of the resized thumbnails at 85% quality with Fant scaling.

    Read the article

  • ASP.NET MVC2 Data Access Layer

    - by Paul
    For a small/medium-sized project I'm trying to figure out the 'ideal' way to structure a domain layer and a data access layer. My view on coupling tends towards domain models not being tightly coupled with the database layer; in other words, the data access layer shouldn't actually know anything about the domain objects. I've been looking at LINQ to SQL, and it wants to use its own models that it creates, so it ends up VERY tightly coupled. Whilst I love the way you use LINQ to SQL in code, I really don't like the way it wants to make its own domain objects. What are some alternatives that I should consider? I tried NHibernate, but I did not like the way I had to query and get different objects. I honestly love the syntax and the way you use LINQ; I just don't want it to be so tightly coupled to domain objects.
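    One common shape for the decoupling described above, sketched rather than prescribed (all names below are invented for illustration): keep plain domain classes and hide the data access behind an interface the domain defines. The implementation could then use hand-written SQL, LINQ to SQL with a mapping step, or NHibernate, without the domain knowing which.

        using System;
        using System.Collections.Generic;

        // Domain layer: a plain object with no persistence concerns.
        public class Product
        {
            public int Id { get; set; }
            public string Name { get; set; }
            public decimal Price { get; set; }
        }

        // Domain layer: the contract the data access layer must satisfy.
        public interface IProductRepository
        {
            Product GetById(int id);
            IEnumerable<Product> FindCheaperThan(decimal maxPrice);
        }

        // Data access layer: one possible implementation; it maps rows to domain
        // objects by hand, so only this class knows about the database.
        public class SqlProductRepository : IProductRepository
        {
            private readonly string connectionString;

            public SqlProductRepository(string connectionString)
            {
                this.connectionString = connectionString;
            }

            public Product GetById(int id)
            {
                // Plain ADO.NET and hand-written SQL would go here.
                throw new NotImplementedException();
            }

            public IEnumerable<Product> FindCheaperThan(decimal maxPrice)
            {
                throw new NotImplementedException();
            }
        }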

    Read the article

  • PayPal IPN - having trouble accessing session data?

    - by Martin Bean
    Hello, all. I'm having issues with PayPal IPN integration, where it seems I cannot get my solution to read session variables. Basically, in my shop module script, I store the customer's details as provided by PayPal to an orders table. However, I also wish to save the products ordered in a transaction to a separate table linked by the order ID. It's the second part of the script that's not working, though, where I loop through the products in the session and then save them to the orders_products table. Is there a reason why the session data is not being read? The code within shop.php is as follows: if ($paypal->validate_ipn()) { $name = $paypal->ipn_data['address_name']; $street_1 = $paypal->ipn_data['address_street']; $street_2 = ""; $city = $paypal->ipn_data['address_city']; $state = $paypal->ipn_data['address_state']; $zip = $paypal->ipn_data['address_zip']; $country = $paypal->ipn_data['address_country']; $txn_id = $paypal->ipn_data['txn_id']; $sql = "INSERT INTO orders (name, street_1, street_2, city, state, zip, country, txn_id) VALUES (:name, :street_1, :street_2, :city, :state, :zip, :country, :txn_id)"; $smt = $this->pdo->prepare($sql); $smt->bindParam(':name', $name, PDO::PARAM_STR); $smt->bindParam(':street_1', $street_1, PDO::PARAM_STR); $smt->bindParam(':street_2', $street_2, PDO::PARAM_STR); $smt->bindParam(':city', $city, PDO::PARAM_STR); $smt->bindParam(':state', $state, PDO::PARAM_STR); $smt->bindParam(':zip', $zip, PDO::PARAM_STR); $smt->bindParam(':country', $country, PDO::PARAM_STR); $smt->bindParam(':txn_id', $txn_id, PDO::PARAM_INT); $smt->execute(); // save products to orders relationship $order_id = $this->pdo->lastInsertId(); // $cart = $this->session->get('cart'); $cart = $this->session->get('cart'); foreach ($cart as $product_id => $item) { $quantity = $item['quantity']; $sql = "INSERT INTO orders_products (order_id, product_id, quantity) VALUES ('$order_id', '$product_id', '$quantity')"; $res = $this->pdo->query($sql); } $this->session->del('cart'); mail('[email protected]', 'IPN result', 'IPN was successful on wrestling-wear.com'); } else { mail('[email protected]', 'IPN result', 'IPN failed on wrestling-wear.com'); } And I'm using the PayPal IPN class for PHP as found here: http://www.micahcarrick.com/04-19-2005/php-paypal-ipn-integration-class.html, but the contents of the validate_ipn() method are as follows: public function validate_ipn() { $url_parsed = parse_url($this->paypal_url); $post_string = ''; foreach ($_POST as $field => $value) { $this->ipn_data[$field] = $value; $post_string.= $field.'='.urlencode(stripslashes($value)).'&'; } $post_string.= "cmd=_notify-validate"; // append IPN command // open the connection to PayPal $fp = fsockopen($url_parsed[host], "80", $err_num, $err_str, 30); if (!$fp) { // could not open the connection. If logging is on, the error message will be in the log $this->last_error = "fsockopen error no. $errnum: $errstr"; $this->log_ipn_results(false); return false; } else { // post the data back to PayPal fputs($fp, "POST $url_parsed[path] HTTP/1.1\r\n"); fputs($fp, "Host: $url_parsed[host]\r\n"); fputs($fp, "Content-type: application/x-www-form-urlencoded\r\n"); fputs($fp, "Content-length: ".strlen($post_string)."\r\n"); fputs($fp, "Connection: close\r\n\r\n"); fputs($fp, $post_string . 
"\r\n\r\n"); // loop through the response from the server and append to variable while (!feof($fp)) { $this->ipn_response.= fgets($fp, 1024); } fclose($fp); // close connection } if (eregi("VERIFIED", $this->ipn_response)) { // valid IPN transaction $this->log_ipn_results(true); return true; } else { // invalid IPN transaction; check the log for details $this->last_error = 'IPN Validation Failed.'; $this->log_ipn_results(false); return false; } }

    Read the article

  • ExecuteStoreQuery passing parameter to StoreProcedure

    - by KDM
    Could someone help with passing a parameter into my ExecuteStoreQuery? I'm executing this from my Entities context and creating my query on the fly. Is this possible, and is my SQL correct? I need to pass an Id parameter value into my SQL statement: var node = db.ExecuteStoreQuery<Node>(@" @Id int // not sure about this declaration for the parameter with c as ( select Id, Name, ParentId,[0] as level from Department d union all select d.Id, d.Name, d.ParentId, [level] + 1 from Department d join c on d.ParentId = c.Id) select * from c where c.Id = @Id" "Departments", System.Data.Objects.MergeOption.NoTracking, new object{ Id: department.Id} // <-- parameter object not working);
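    A sketch of one way this is commonly written, not a confirmed fix: ExecuteStoreQuery can supply parameters itself through positional placeholders, so the T-SQL needs no parameter declaration, and the trailing arguments should be the parameter values (the overload taking an entity set name and MergeOption exists separately, for materializing tracked entities). Using the question's own tables and types:

        // {0} is a positional placeholder; EF converts it into a real DbParameter.
        var nodes = db.ExecuteStoreQuery<Node>(@"
            WITH c AS (
                SELECT Id, Name, ParentId, 0 AS [level]
                FROM Department
                UNION ALL
                SELECT d.Id, d.Name, d.ParentId, c.[level] + 1
                FROM Department d
                JOIN c ON d.ParentId = c.Id)
            SELECT * FROM c WHERE c.Id = {0}",
            department.Id).ToList();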

    Read the article

  • Hibernate generates VARBINARY column with JPA @Enumerated annotation

    - by tran
    Hibernate is generating a VARBINARY column for a JPA @Enumerated annotation. I'm using SQL Server 2005 with the jTDS driver. The mapping is dirt simple: @Basic @Enumerated(EnumType.ORDINAL) private MyEnum foo; I would have expected Hibernate to generate an integer column. I've also tried EnumType.STRING (expecting a varchar column) with no success. Note that my app works fine; the column type just makes it hard to inspect the DB and to issue ad-hoc SQL when poking at the database.
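    One frequently cited cause, offered as an assumption to check rather than a confirmed diagnosis: Hibernate falls back to serializing the enum (hence VARBINARY) when it never sees the @Enumerated annotation, which happens when the entity's access type is property-based (for example, @Id sits on a getter) while @Enumerated sits on a field. Keeping the placement consistent with the rest of the entity is the usual fix:

        // If the entity uses property access, annotate the getter, not the field.
        @Basic
        @Enumerated(EnumType.ORDINAL)
        public MyEnum getFoo() {
            return foo;
        }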

    Read the article

  • So… What is a SharePoint Developer?

    - by Mark Rackley
    A few days ago Stacy Draper and I were chatting about what it means to be a SharePoint Developer. That actually turns out to be a conversation with lots of shades of grey. Stacy thought it would make a good blog post… well, I can’t promise this to be a GOOD blog post… So, anyway, I decided to let off a little bomb this morning by posting the following tweet on Twitter: @mrackley: Can someone be considered a SharePoint Developer if all they know how to do is work in SPD? Now, I knew this is a debate that has been going on since the first SharePoint Designer User put SharePoint Developer on their resume. There are probably several blogs out there on the subject, but with the wildfire that is jQuery and a few other new features out there I believe it is an important subject to tackle again. I got a lot of great feedback as well on Twitter. The entire twitter conversation is at the end of this blog posting. Thanks, everyone, for your opinions. Who cares? Why does it matter? Can’t we all just get along? Yes it matters… everything must be labeled and put in its proper place. Pigeon holing is the only way to go!  Just kidding.. I’m not near that anal, but yes! It is important to be able to properly identify the skill set of those people on your team and correctly identify the role you are wanting to hire. Saying you are a “SharePoint Developer” is just too vague and barely begins to answer the question. Also, knowing who’s on your team and what they can do will ensure you give your clients the best people for the job. A Developer writes code right? So, a Developer uses Visual Studio! Whoa, hold on there Sparky. Even if I concede that to be a developer you have to write code, you still can’t say a SharePoint Developer has to use Visual Studio.  So, you can spell C#, how well can you write XSLT? How’s your jQuery? Sorry bud, that’s code whether you like it or not. There are many ways to write code in SharePoint that have nothing to do with cracking open Visual Studio. So, what are the different ways to develop in SharePoint then? How many different ways can you “develop” in SharePoint?? A lot… Out of the box features In SharePoint you can create a site, create a custom list on that site, do basic calculations in a calculated column, set up alerts, and add all sorts of web parts to a page. Let’s face it.. that IS development! javaScript/jQuery Perhaps you’ve heard by now about this thing called jQuery? It’s all over the place and the answer to a lot of people’s prayers. However be careful, with great power comes great responsibility. Remember, javaScript is executed on the client side and if you abuse it your performance could be affected. Also, Marc Anderson (@sympmarc) wrote a pretty awesome javaScript library called SPServices.  This allows you to access SharePoint’s Web Services using jQuery. How freakin cool is that? With these tools at your disposal the number of things you CAN’T do without Visual Studio grows smaller and smaller. This is definitely development no matter what anyone else says and there is no Visual Studio involved. SharePoint Designer Ahhh.. The cause of and the answer to all of your SharePoint development problems. With SharePoint Designer you can use DataView Web Parts, develop (there’s that word again) your branding, and even connect to external datasources.  There’s a lot you can do in SharePoint Designer. It’s got its shortcomings, but it is an invaluable tool in the SharePoint developer’s toolbox. 
    InfoPath So, can InfoPath development really be considered SharePoint development? I would say yes. You can connect to SharePoint lists, populate fields in a SharePoint list, and even write code in InfoPath. Sounds like SharePoint development to me. Visual Studio – Web Services/WCF So, get this. You can write code for SharePoint and not have a clue what the 12 hive is, what “site actions” means, or know how to do ANYTHING in SharePoint? Poppycock! You say? SharePoint Web Services I say… With SharePoint Web Services you can totally interact with SharePoint without knowing anything about SharePoint. I don’t recommend it of course, but it’s possible. What can you write using SharePoint Web Services? How about a little application called SharePoint Designer? Visual Studio – Object Model And here we are finally: the SharePoint Object Model. When you hear “SharePoint Developer” most people think of someone opening Visual Studio and creating a custom web part, workflow, event receiver, etc.. etc.. but I hope that by now I have made the point that this is NOT the only form of SharePoint Development! Again… Who cares? Just crack open Visual Studio for everything! Problem solved! Let’s ponder for a moment, shall we? The business comes to you with a requirement that involves some pretty fancy business calculations, and a complicated view that they do NOT want to look like SharePoint. “No Problem” you proclaim, you mighty SharePoint Developer. You go back to your cube, chuckle at the latest Dilbert comic, and crack open Visual Studio. Then you build your custom web part… fight with all the deployment, migration, and UAT that you must go through and proclaim victory two weeks later!!!! Well done my good sir/ma’am! Oh wait… it turns out Sally, who is not a “developer”, did the exact same thing with a Dataview web part and some jQuery, and it’s been in production for two weeks? #CockinessFail I know there are many ASP.NET developers out there that can create a custom control and wrap it to be a SharePoint Web Part. That does NOT mean they are SharePoint Developers though as far as I’m concerned, and I personally would much rather have someone on my team that can manipulate the heck (yes, I said ‘heck’) out of SharePoint using Dataview Web Parts, jQuery, and a roll of duct tape. Just because you know how to write code in Visual Studio does not mean you are a SharePoint Developer. What’s the conclusion here? How do we define ‘it’ and what is ‘it’ called? Fortunately, this is MY blog. I don’t have to give answers, I can stir the pot, laugh and leave you to ponder what it means! There is obviously no right or wrong answer here (unless you disagree with me, then you are flat out wrong). Anyway, there are many opinions. Here’s mine. If you put SharePoint Developer on your resume, make sure to clearly specify HOW you develop in SharePoint and what tools you use. If we must label these gurus of jQuery and SPD, how about “SharePoint Client Developer” or “SharePoint Front End Developer”? Just throwing out an idea. Whatever we call them, to say they are not developers is short-sighted, arrogant, and unfair. Of course, then we need to figure out what to call all those other SharePoint development types.
    Twitter Conversation
    @next_connect: RT @mrackley: Can someone be considered a SharePoint Developer if all they know how to do is work in SPD? | I say no....
    @mikegil: @mrackley re: yr Developer question: SPD expert <> SP Developer. Can be "sous-developer," though. #SharePoint #SPD
    @WonderLaura: Rt @mrackley Can someone be considered a SharePoint Dev if all they know how to do is work in SPD? -- My opinion is that devs write code.
    @exnav29: Rt @mrackley Can someone be considered a SharePoint Dev if all they know how to do is work in SPD? => I think devs would use VS as well
    @ssKevin: @WonderLaura @mrackley does that mean strictly vb and c# when it comes to #SharePoint ?
    @jimmywim: @exnav29 @mrackley nah, I'd say they were a power user. Devs know their way around the 12 hive ;)
    @sympmarc: RT @mrackley: Can someone be considered a SharePoint Developer if all they know how to do is work in SPD? -> Fighting words.
    @sympmarc: @next_connect @mrackley Besides, we prefer to be called "hacks". ;+)
    @next_connect: @sympmarc The important thing is that you don't have to develop code to solve problems and create solutions. @mrackley
    @mrackley: @sympmarc @next_connect not tryin to pick fight.. just try and find consensus on definition
    @usher: @mrackley I'd still argue that you have a DevLite title that's out there for the collaboration engineers (@sympmarc @next_connect)
    @next_connect: @usher I agree. I've called it Light Dev/ Configuration before. @sympmarc @mrackley
    @usher: @next_connect I like DevLite, low calorie but still same great taste :) @mrackley @sympmarc
    @mrackley: @next_connect @usher @sympmarc I don't think there's any "lite" to someone who can bend jQuery and XSLT to their will.
    @usher: @mrackley okay, so would you refer to someone that writes user controls and assemblies something different (@next_connect @sympmarc)
    @usher: @mrackley when looking for a developer that can write .net code, it's a bit different than an XSLT/jQuery designer. @sympmarc @next_connect
    @jimmywim: @mrackley @sympmarc @next_connect I reckon a "dev" does managed code and works in the 12 hive
    @sympmarc: @jimmywim @mrackley @next_connect We had a similar debate a few days ago @toddbleeker et al
    @sympmarc: @sympmarc @jimmywim @mrackley @next_connect @toddbleeker @stevenmfowler More abt my Middle Tier term, but still connected. Meet bus need.
    @toddbleeker: @sympmarc @jimmywim @mrackley @next_connect I used "No Assembly Required" in the past. I also suggested "Supplimenting the SharePoint DOM"
    @toddbleeker: @sympmarc @jimmywim @mrackley @next_connect Others suggested Information Worker Solutions/Enhancements
    @toddbleeker: @sympmarc @jimmywim @mrackley @next_connect @stevenmfowler I also like "SharePoint Scripting Solutions". All the technologies are script.
    @jimmywim: @toddbleeker @sympmarc @mrackley @next_connect I like the IW solutions one...
    @toddbleeker: @sympmarc @jimmywim @mrackley @next_connect @stevenmfowler This is like the debate that never ends: it is definitely not called Middle Tier.
    @jimmywim: @toddbleeker @sympmarc @mrackley @next_connect @stevenmfowler "Scripting" these days makes me think PowerShell...
    @sympmarc: @toddbleeker @jimmywim @mrackley @next_connect @stevenmfowler If it forces a debate on h2 best solve bus probs, I'll keep sayin Middle Tier.
    @usher: @sympmarc so we know what we're looking for, we just can't define a name? @toddbleeker @jimmywim @mrackley @next_connect @stevemfowler
    @sympmarc: @usher @sympmarc @toddbleeker @jimmywim @mrackley @next_connect @stevemfowler The naming seems to matter more than the substance. :-(
    @jimmywim: @sympmarc @usher @toddbleeker @mrackley @next_connect @stevemfowler work brkdn defines tasks, defines tools needed, can then b grp'd by user
    @WonderLaura: @mrackley @toddbleeker @jimmywim @sympmarc @usher @next_connect Funny you're asking. @johnrossjr and I spent hours this week on the subject.
    @stevenmfowler: RT @toddbleeker: @sympmarc @jimmywim @mrackley @next_connect @stevenmfowler it is definitely not called Middle Tier. < I'm with Todd

    Read the article

  • Java - When to use Iterators?

    - by Walter White
    Hi all, I am trying to better understand when I should and should not use Iterators. To me, whenever I have a potentially large amount of data to iterate through, I write an Iterator for it. If it also lends itself to the Iterator interface, then it seems like a win. I have read a little about there being overhead associated with using an Iterator. A good example of where I used an Iterator was to iterate through a bunch of SQL scripts and execute one query at a time, reading each in, then executing it. Is there another performance trade-off I should be aware of? Before I used iterators, I would read the entire String of SQL commands to execute into an ArrayList, and then iterate through that. If the import is rather large (like for geolocation data), the server tends to get bogged down. Walter
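    A sketch of the memory trade-off the question describes (the file name and the one-statement-per-line format are assumptions for illustration): streaming statements keeps memory use flat, whereas pre-loading the whole script into an ArrayList costs memory proportional to the script size. The per-call overhead of hasNext()/next() is tiny next to the I/O and query execution it wraps:

        import java.io.BufferedReader;
        import java.io.FileReader;
        import java.io.IOException;

        public class ScriptRunner {
            public static void main(String[] args) throws IOException {
                // Reads and executes one statement at a time; the full script
                // is never held in memory.
                try (BufferedReader reader = new BufferedReader(new FileReader("import.sql"))) {
                    String statement;
                    while ((statement = reader.readLine()) != null) {
                        execute(statement);
                    }
                }
            }

            private static void execute(String sql) {
                // Placeholder: a real version would run this through a JDBC Statement.
                System.out.println("executing: " + sql);
            }
        }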

    Read the article

  • UDF call in entity framework is cached

    - by Fred Yang
    I am doing a test after reading the article http://blogs.msdn.com/alexj/archive/2009/08/07/tip-30-how-to-use-a-custom-store-function.aspx about calling UDFs. When I use a function with objectContext.Entities.Where(t => udf(para1, para2) == 1) (here Entities is not an ObjectQuery but an ObjectSet), the first time I call the method it runs correctly. If I reuse the objectContext and run it again with a different para1 and para2, the previous parameter values are still cached, and the result is the same as the previous one, which is wrong. SQL Profiler shows that both queries hit the database, but the T-SQL is the same. Am I missing something? Also, ObjectSet does not support .Where(esql_string). How do I get the UDF working with an ObjectSet? Thanks, Fred

    Read the article

  • PHP Framework Advice

    - by gnomixa
    I am looking for a lightweight PHP framework with the following qualifications: the ability to write my own SQL queries (I simply don't trust the CakePHP-like approach where the framework writes your SQL for you); the ability to integrate jQuery easily; built-in templating, or a relatively easy way to introduce Smarty (or another templating engine); MVC; fast. Any advice/comparisons? I have looked into CodeIgniter, Symfony and CakePHP so far. Symfony is slow, and CakePHP is too inaccessible... so far my choice would be CodeIgniter. I played with it a bit, but I would like to hear more experiences. I am looking for a framework that will "enforce" organization of my app in a logical way - MVC seems like a great choice.

    Read the article

  • Net::Cassandra::Easy equivalent of "SELECT * FROM ..."

    - by knorv
    When using Perl's Net::Cassandra::Easy, the following code will retrieve columns col[1-3] from rows row[1-3]: $result = $cassandra->get(['row1', 'row2', 'row3'], family => 'Standard1', byname => ['col1', 'col2', 'col3']); The corresponding SQL would be: SELECT col1, col2, col3 FROM rows WHERE id IN ('row1', 'row2', 'row3'); Suppose instead that I want to retrieve all columns. In SQL terms that would be: SELECT * FROM rows WHERE id IN ('row1', 'row2', 'row3'); To get all columns I am currently using: $result = $cassandra->get(['row1', 'row2', 'row3'], family => 'Standard1', byoffset => { "count" => 1_000_000 }); This works as long as the number of columns does not exceed one million. While this works, I'd assume that there is a cleaner way to do it. My question is: is there any cleaner way to specify to Cassandra that I want to retrieve all columns for the matching rows?

    Read the article

  • Recordset Paging works in IIS6 but not in IIS7

    - by Spudhead
    Hi there, I've got recordset paging set up; it works fine in IIS6, but when I run the site on an IIS7 server I get the following error: Microsoft OLE DB Provider for SQL Server error '80004005' [DBNETLIB][ConnectionOpen (Connect()).]SQL Server does not exist or access denied. /orders.asp, line 197 The code looks like this: Set objPagingConn = Server.CreateObject("ADODB.Connection") objPagingConn.Open CONN_STRING Set objPagingRS = Server.CreateObject("ADODB.Recordset") objPagingRS.PageSize = iPageSize objPagingRS.CacheSize = iPageSize objPagingRS.Open strSQL, objPagingConn, adOpenStatic, adLockReadOnly, adCmdText iPageCount = objPagingRS.PageCount iRecordCount = objPagingRS.RecordCount Line 197 is the objPagingConn.Open line. I've got about 10 sites like this to migrate; is there a simple fix in IIS7? Help is greatly appreciated! Many thanks, Martin

    Read the article

  • Run a shell command with arguments from powershell script

    - by Mike Weerasinghe
    Hello, I need to extract and save some tables from a remote SQL database using bcp. I would like to write a PowerShell script to invoke bcp for each table and save the data. So far I have this script that creates the necessary arguments for bcp. However, I cannot figure out how to pass the arguments to bcp. Every time I run the script it just shows the bcp help instead. This must be something really easy that I am not getting. #commands bcp database.dbo.tablename out c:\temp\users.txt -N -t, -U uname -P pwd -S <servername> $bcp_path = "C:\Program Files\Microsoft SQL Server\90\Tools\Binn\bcp.exe" $serverinfo =@{} $serverinfo.add('table','database.dbo.tablename') $serverinfo.add('uid','uname') $serverinfo.add('pwd','pwd') $serverinfo.add('server','servername') $out_path= "c:\Temp\db\" $args = "$($serverinfo['table']) out $($out_path)test.dat -N -t, -U $($serverinfo['uid']) -P $($serverinfo['pwd']) -S $($serverinfo['server'])" #this is the part I can't figure out & $bcp_path $args
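    A sketch of one common fix (not tested against the asker's environment): when the call operator & is handed a single string, the whole thing reaches bcp.exe as one argument, so bcp prints its help. Building the arguments as an array makes PowerShell pass each element separately; renaming the variable also matters, because $args is an automatic variable that PowerShell reserves:

        # Same server info as above, but built as an argument array instead of one string.
        $bcpArgs = @(
            $serverinfo['table'], 'out', (Join-Path $out_path 'test.dat'),
            '-N', '-t,',
            '-U', $serverinfo['uid'],
            '-P', $serverinfo['pwd'],
            '-S', $serverinfo['server']
        )

        # Each array element becomes a separate command-line argument to bcp.exe.
        & $bcp_path $bcpArgs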

    Read the article

  • JDBC resultset close

    - by KM
    I am profiling my Java application and found some interesting statistics for a JDBC PreparedStatement call. The environment details are given below: Database: Sybase SQL Anywhere 10.0.1 Driver: com.sybase.jdbc3.jdbc.SybDriver Connection pool: c3p0 JRE: 1.6.0_05 The code in question is given below: try { ps = conn.prepareStatement(sql); ps.setDouble(...); rs = ps.executeQuery(); ...... return xyz; } finally { try { if (rs != null) rs.close(); if (ps != null) ps.close(); } catch (SQLException sqlEx) { } } From the JProfiler stats, I find that this particular ResultSet.close() call alone takes a large amount of time. It varies from 25 ms to 320 s, while for other code blocks which are identical in nature I find that it takes close to 20 microseconds. Just to be sure, I ran this performance test multiple times and confirmed this data. I am puzzled by this behaviour - thoughts?

    Read the article

  • How do I put array into columns

    - by mathew
    $db = mysql_connect("", "", "") or die("Could not connect."); mysql_select_db("",$db)or die(mysql_error()); $sql = "SELECT * FROM table where 1"; $pager = new pager($sql,'page',6); while($row = mysql_fetch_array($pager->result)) { echo $row['persons']."<br>"; } mysql_close($db); The above code outputs: Mathew Thomas John Stewart Watson Kelvin What I need is for the output to be split into multiple columns, say: Mathew Stewart Thomas Watson John Kelvin How do I do this?
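    One way to get the two-column layout, sketched with the question's own query and the assumption that the first half of the list forms column one: collect the names first, split them with array_chunk(), then print one name from each column per row.

        <?php
        // Collect all names first instead of echoing them one by one.
        $persons = array();
        while ($row = mysql_fetch_array($pager->result)) {
            $persons[] = $row['persons'];
        }

        // Split into 2 columns: first half, second half.
        $half    = (int) ceil(count($persons) / 2);
        $columns = array_chunk($persons, $half);

        // Print row by row: one entry from each column.
        for ($i = 0; $i < $half; $i++) {
            $left  = $columns[0][$i];
            $right = isset($columns[1][$i]) ? $columns[1][$i] : '';
            echo $left . ' ' . $right . "<br>";
        }
        ?>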

    Read the article

  • How to write Asynchronous LINQ query?

    - by Morgan Cheng
    After reading a bunch of LINQ-related material, I suddenly realized that no articles introduce how to write an asynchronous LINQ query. Suppose we use LINQ to SQL; the statement below is clear. However, if the SQL database responds slowly, the thread using this block of code is held up. var result = from item in Products where item.Price > 3 select item.Name; foreach (var name in result) { Console.WriteLine(name); } It seems that the current LINQ query spec doesn't provide support for this. Is there any way to do asynchronous LINQ programming? Ideally it would work with a callback notification when the results are ready to use, without any blocking delay on I/O.
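    A sketch of one era-appropriate workaround (assuming .NET 4's Task Parallel Library; the query and the Products source are taken from the question): the LINQ query itself stays lazy, so the trick is to move the enumeration, which is where the database work actually happens, onto a background task and get the callback via a continuation. This is thread-pool offloading rather than true asynchronous I/O, but it frees the calling thread:

        // requires: using System; using System.Linq; using System.Threading.Tasks;

        // Kick off the query on the thread pool; ToList() forces execution there.
        var task = Task.Factory.StartNew(() =>
            (from item in Products
             where item.Price > 3
             select item.Name).ToList());

        // The continuation acts as the "results are ready" callback.
        task.ContinueWith(t =>
        {
            foreach (var name in t.Result)
                Console.WriteLine(name);
        });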

    Read the article

  • Web publishing system with code highlighting

    - by Dragos Toader
    I'd like to publish some of the many programs I've written on the web. Is there a syntax highlighting Linux web publishing application (CMS/Blog/RoR app) that displays syntax for C++, Python, Bash scripts, SQL, VBA, awk, Erlang, java, makefiles, Ruby, Pascal and other languages? The more syntax settings configuration files, the better. The extensions I have in Textpad (for which I have syntax highlighting -- syn files) are .as, .asm, .asp, .awk, .bas, .bat, .c, .conf, .cpp, .cs, .ctl, .dfm, .dsc, .erl, .fnc, .h, .hpp, .inf, .ini, .jav, .java, .mak, .nsh, .nsi, .ora, .pas, .pkb, .pks, .pl, .prc, .py, .reg, .rsp, .sh, .sql, .syn, .tcl, .trg, .vw, .xml, .xsl, .xslfo

    Read the article

  • Npgsql insert file path containing backslashes "\\"

    - by Chau
    I am trying to create a record containing the path to a file. The insertion is done into a Postgres database where UTF8 is enabled, using the Npgsql driver. My table definition:

        CREATE TABLE images (
            id serial,
            file_location character varying NOT NULL
        )

    My SQL statement (boiled down to a minimum):

        INSERT INTO images (file_location) VALUES (E'\\2010')

    When using pgAdmin to run the above statement, it works fine. Using the Npgsql driver through Visual Studio C#, it fails with this exception: "ERROR: 22021: invalid byte sequence for encoding \"UTF8\": 0x81". Replacing my SQL statement with the following (note the extra a after the double backslashes) works:

        INSERT INTO images (file_location) VALUES (E'\\a2010')

    (The E prefix is encouraged by pgAdmin when using double backslashes.) So a quick recap: why is Npgsql stopping my insertion of the \\2010?
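    A minimal sketch of a likely fix (conn is assumed to be an open NpgsqlConnection; names are illustrative): pass the path as a query parameter instead of splicing it into the SQL string, so neither PostgreSQL's escape-string rules nor client-side backslash handling gets a chance to mangle the bytes.

        using Npgsql;

        class ImageDao {
            // conn is assumed to be an open NpgsqlConnection to the same database.
            static void InsertImage(NpgsqlConnection conn, string fileLocation) {
                using (var cmd = new NpgsqlCommand(
                    "INSERT INTO images (file_location) VALUES (:file_location)", conn)) {
                    cmd.Parameters.Add(new NpgsqlParameter("file_location", fileLocation));
                    cmd.ExecuteNonQuery();
                }
            }
        }

        // Example: a C# verbatim string keeps the backslash literal.
        // ImageDao.InsertImage(conn, @"\2010");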

    Read the article

  • Microsoft's new technical computing initiative

    - by Randy Walker
    I made a mental note about this earlier in the year. Microsoft literally buys computers by the truckload. From what I understand, it's a typical practice amongst large software vendors. You plug a few wires in, you test it, and you instantly have mega tera tera flops (don't hold me to that number). Microsoft has been plugging away at its cloud service (named Azure), which, for the layman, means Microsoft runs your software on its computers, and as demand increases you can allocate more computing power on the fly.

    With this in mind, it doesn't surprise me that I was recently sent an executive email concerning Microsoft's new technical computing initiative. I find it to be a great marketing idea with actual substance behind the real work. From a programmer's academic perspective, in college we dreamed about this type of processing power; it has decades of computer science theory behind it.

    A copy of the email I received follows. (Note that I almost deleted it, thinking it was spam due to its length.)

    We don't often think about how complex life really is. Take the relatively simple task of commuting to and from work: it is, in fact, a complicated interplay of variables such as weather, train delays, accidents, traffic patterns, road construction, etc. You can, however, take steps to shorten your commute by using a good, predictive understanding of a few of these variables. In fact, you probably are already taking these inputs and instinctively building a predictive model that you act on daily to get to your destination more quickly.

    Now, when we apply the same method to very complex tasks, this modeling approach becomes much more challenging. Recent world events clearly demonstrated our inability to process vast amounts of information and variables that would have helped to more accurately predict the behavior of global financial markets or the occurrence and impact of a volcano eruption in Iceland.

    To make sense of issues like these, researchers, engineers and analysts create computer models of the almost infinite number of possible interactions in complex systems. But they need increasingly more sophisticated computer models to better understand how the world behaves and to make fact-based predictions about the future. And to do this, it requires a tremendous amount of computing power to process and examine the massive data deluge from cameras, digital sensors and precision instruments of all kinds. This is the key to creating more accurate and realistic models that expose the hidden meaning of data, which gives us the kind of insight we need to solve a myriad of challenges.

    We have made great strides in our ability to build these kinds of computer models, and yet they are still too difficult, expensive and time consuming to manage. Today, even the most complicated data-rich simulations cannot fully capture all of the intricacies and dependencies of the systems they are trying to model. That is why, across the scientific and engineering world, it is so hard to say with any certainty when or where the next volcano will erupt and what flight patterns it might affect, or to more accurately predict something like a global flu pandemic.

    So far, we just cannot collect, correlate and compute enough data to create an accurate forecast of the real world. But this is about to change. Innovations in technology are transforming our ability to measure, monitor and model how the world behaves. The implication for scientific research is profound, and it will transform the way we tackle global challenges like health care and climate change. It will also have a huge impact on engineering and business, delivering breakthroughs that could lead to the creation of new products, new businesses and even new industries.

    Because you are a subscriber to executive e-mails from Microsoft, I want you to be the first to know about a new effort focused specifically on empowering millions of the world's smartest problem solvers. Today, I am happy to introduce Microsoft's Technical Computing initiative. Our goal is to unleash the power of pervasive, accurate, real-time modeling to help people and organizations achieve their objectives and realize their potential. We are bringing together some of the brightest minds in the technical computing community across industry, academia and science at www.modelingtheworld.com to discuss trends, challenges and shared opportunities.

    New advances provide the foundation for tools and applications that will make technical computing more affordable and accessible where mathematical and computational principles are applied to solve practical problems. One day soon, complicated tasks like building a sophisticated computer model that would typically take a team of advanced software programmers months to build and days to run will be accomplished in a single afternoon by a scientist, engineer or analyst working at the PC on their desktop. And as technology continues to advance, these models will become more complete and accurate in the way they represent the world. This will speed our ability to test new ideas, improve processes and advance our understanding of systems.

    Our technical computing initiative reflects the best of Microsoft's heritage. Ever since Bill Gates articulated the then far-fetched vision of "a computer on every desktop" in the early 1980's, Microsoft has been at the forefront of expanding the power and reach of computing to benefit the world. As someone who worked closely with Bill for many years at Microsoft, I am happy to share with you that the passion behind that vision is fully alive at Microsoft and is carried out in the creation of our new Technical Computing group.

    Enabling more people to make better predictions

    We have seen the impact of making greater computing power more available firsthand through our investments in high performance computing (HPC) over the past five years. Scientists, engineers and analysts in organizations of all sizes and sectors are finding that using distributed computational power creates societal impact, fuels scientific breakthroughs and delivers competitive advantages. For example, we have seen remarkable results from some of our current customers:

    - Malaria strikes 300,000 to 500,000 people around the world each year. To help in the effort to eradicate malaria worldwide, scientists at Intellectual Ventures use software that simulates how the disease spreads and would respond to prevention and control methods, such as vaccines and the use of bed nets. Technical computing allows researchers to model more detailed parameters for more accurate results and receive those results in less than an hour, rather than waiting a full day.

    - Aerospace engineering firm a.i. solutions, Inc. needed a more powerful computing platform to keep up with the increasingly complex computational needs of its customers: NASA, the Department of Defense and other government agencies planning space flights. To meet that need, it adopted technical computing. Now, a.i. solutions can produce detailed predictions and analysis of the flight dynamics of a given spacecraft, from optimal launch times and orbit determination to attitude control and navigation, up to eight times faster. This enables them to avoid mistakes in any areas that can cause a space mission to fail and potentially result in the loss of life and millions of dollars.

    - Western & Southern Financial Group faced the challenge of running ever larger and more complex actuarial models as its number of policyholders and products grew and regulatory requirements changed. The company chose an actuarial solution that runs on technical computing technology. The solution is easy for the company's IT staff to manage and adjust to meet business needs. The new solution helps the company reduce modeling time by up to 99 percent, letting the team fine-tune its models for more accurate product pricing and financial projections.

    Our Technical Computing direction

    Collaborating closely with partners across industry and academia, we must now extend the reach of technical computing even further to help predictive modelers and data explorers make faster, more accurate predictions. As we build the Technical Computing initiative, we will invest in three core areas:

    - Technical computing to the cloud: Microsoft will play a leading role in bringing technical computing power to scientists, engineers and analysts through the cloud. Existing high-performance computing users will benefit from the ability to augment their on-premises systems with cloud resources that enable 'just-in-time' processing. This platform will help ensure processing resources are available whenever they are needed: reliably, consistently and quickly.

    - Simplify parallel development: Today, computers are shipping with more processing power than ever, including multiple cores, but most modern software only uses a small amount of the available processing power. Parallel programs are extremely difficult to write, test and troubleshoot. However, a consistent model for parallel programming can help more developers unlock the tremendous power in today's modern computers and enable a new generation of technical computing. We are delivering new tools to automate and simplify writing software through parallel processing from the desktop... to the cluster... to the cloud.

    - Develop powerful new technical computing tools and applications: We know scientists, engineers and analysts are pushing common tools (i.e., spreadsheets and databases) to the limits with complex, data-intensive models. They need easy access to more computing power and simplified tools to increase the speed of their work. We are building a platform to do this. Our development efforts will yield new, easy-to-use tools and applications that automate data acquisition, modeling, simulation, visualization, workflow and collaboration. This will allow them to spend more time on their work and less time wrestling with complicated technology.

    Thinking bigger

    There is so much left to be discovered and so many questions yet to be answered in the fascinating world around us. We believe the technical computing community will show us that we have not seen anything yet. Imagine just some of the breakthroughs this community could make possible:

    - Better predictions to help improve the understanding of pandemics, contagion and global health trends.

    - Climate change models that predict environmental, economic and human impact, accessible in real-time during key discussions and debates.

    - More accurate prediction of natural disasters and their impact to develop more effective emergency response plans.

    With an ambitious charter in hand, this new team is ready to build on our progress to date and execute Microsoft's technical computing vision over the months and years ahead. We will steadily invest in the right technologies, tools and talent, and work to bring together the technical computing community. I invite you to visit www.modelingtheworld.com today. We welcome your ideas and feedback. I look forward to making this journey with you and others who want to answer the world's biggest questions, discover solutions to problems that seem impossible and uncover a host of new opportunities to change the world we live in for the better.

    Bob

    Read the article
