Search Results

Search found 34699 results on 1388 pages for 'database backup'.

Page 584/1388

  • How should I handle "real time" events in an online strategy game?

    - by Hojat Taheri
    Some online strategy games have real-time events. For example, when you send troops to attack somewhere, the attack resolves at the right time in the future. Checking the database again and again, every second, to get the list of attacks in progress would cause heavy load. Is there a technique to achieve this without constant polling? Another example: you want to attack a village 3 hours away, so you send troops and the attack occurs 3 hours later. Should a script really check the database every second in order to run the query at the specified time?
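
    A common way around per-second polling is either to resolve the outcome lazily (compute the battle result whenever the village is next loaded, based on the stored arrival time) or to have a worker sleep until the next due timestamp rather than scanning every row every second. A minimal in-process sketch of the second idea, assuming attack IDs and due times are also persisted (for example in a pending_attacks table) so they survive restarts; all names here are illustrative:

    using System;
    using System.Collections.Generic;
    using System.Threading;

    // Minimal sketch: resolve attacks when their due time arrives without
    // polling every row every second. Due times are assumed to be persisted
    // as well; here they are mirrored in an in-memory queue ordered by time.
    class AttackScheduler
    {
        private readonly SortedSet<(DateTime dueAt, int attackId)> _pending =
            new SortedSet<(DateTime, int)>();

        public void Schedule(int attackId, DateTime dueAt) =>
            _pending.Add((dueAt, attackId));

        public void Run()
        {
            while (true)
            {
                if (_pending.Count == 0) { Thread.Sleep(1000); continue; }

                var next = _pending.Min;
                var wait = next.dueAt - DateTime.UtcNow;
                if (wait > TimeSpan.Zero)
                {
                    // Sleep until the next event is due instead of re-checking constantly.
                    // A real server would also wake up when an earlier event is scheduled.
                    Thread.Sleep(wait);
                }

                _pending.Remove(next);
                ResolveAttack(next.attackId);   // apply the battle outcome, write results back
            }
        }

        private void ResolveAttack(int attackId) =>
            Console.WriteLine($"Attack {attackId} resolved at {DateTime.UtcNow:O}");
    }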

    Read the article

  • Managing multiple references of the same game entity in different places using IDs

    - by vargonian
    I've seen great questions on similar topics, but none that addressed this particular method: Given that I have multiple collections of game entities in my [XNA Game Studio] game, with many entities belonging to multiple lists, I'm considering ways I could keep track of whenever an entity is destroyed and remove it from the lists it belongs to. A lot of potential methods seem sloppy/convoluted, but I'm reminded of a way I've seen before in which, instead of having multiple collections of game entities, you have collections of game entity IDs instead. These IDs map to game entities via a central "database" (perhaps just a hash table). So, whenever any bit of code wants to access a game entity's members, it first checks to see if it's even in the database still. If not, it can react accordingly. Is this a sound approach? It seems that it would eliminate many of the risks/hassles of storing multiple lists, with the tradeoff being the cost of the lookup every time you want to access an object.
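
    For reference, a minimal sketch of the ID-registry idea described above, with the central "database" as a hash table; all names are illustrative and the XNA specifics are omitted:

    using System.Collections.Generic;

    // Lists elsewhere hold entity IDs; this registry is the single owner of the entities.
    public class EntityRegistry<T> where T : class
    {
        private readonly Dictionary<int, T> _entities = new Dictionary<int, T>();
        private int _nextId;

        public int Register(T entity)
        {
            int id = _nextId++;
            _entities[id] = entity;
            return id;
        }

        // Returns null if the entity has been destroyed; callers react accordingly.
        public T Lookup(int id) =>
            _entities.TryGetValue(id, out var entity) ? entity : null;

        public void Destroy(int id) => _entities.Remove(id);
    }

    // Usage: other systems keep List<int> instead of List<GameEntity>.
    //   var id = registry.Register(new GameEntity(...));
    //   visibleEntities.Add(id);
    //   var e = registry.Lookup(id); if (e != null) e.Update(gameTime);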

    Read the article

  • Why ISVs Run Applications on Oracle SuperCluster

    - by Parnian Taidi-Oracle
    In this short video, Michael Palmeter, Senior Director of Product Development for Oracle Engineered Systems, discusses how ISVs can easily run up to 20x faster, gain 28:1 storage compression, and grow their market presence, all without any changes to their code. Part of the Oracle engineered systems family, Oracle SuperCluster provides maximum end-to-end database and application performance with minimal initial and ongoing support and maintenance effort, at the lowest total cost of ownership. Java or enterprise applications running on Oracle Database 11gR2 or higher and Oracle Solaris 11 can run up to 20x faster on Oracle SuperCluster than on traditional platforms, without any changes to their code. A large number of customers are consolidating hundreds of their applications and databases on Oracle SuperCluster and are requiring their ISVs to support it. ISVs can become Oracle SuperCluster Ready and Oracle SuperCluster Optimized by joining the Oracle Exastack program.

    Read the article

  • Perspective in Modeling

    - by drsql
    Your task: model a database that represents a suburban block. You survey the area and see the following houses (pictures culled from Wikipedia here). So you look at the houses and start modeling roofs, windows, lawns, driveways, mailboxes, porches, etc. You get done, and with your 30+ tables you are feeling great, right? I know I would be. “I knocked this out of the park! We can capture everything about these houses. I…am…a…superhero database modeler,” I think, “I will get a big...(read more)

    Read the article

  • What Counts for a DBA: Humility

    - by drsql
    In football (the American sort, naturally) there are a select group of players who really hope to never have their names called during the game. They are members of the offensive line, and their job is to protect other players so they can deliver the ball to the goal to score points. When you do hear their name called, it is usually because they made a mistake and the player they were supposed to protect ended up flat on his back admiring the clouds in the sky instead of advancing towards the goal to score points. Even on the rare occasion their name is called for a good reason, it is usually because they were making up for a teammate who had made a mistake and they covered for him. The role of offensive lineman is a very good analogy for the role of the admin DBA. As a DBA, you are called on to be barely visible and rarely heard, protecting the company's data assets tenaciously, even though the enemies of our craft surround us on all sides: Developers: Cries of ‘foul!’ often ensue when the DBA says that they want data integrity to be stringently enforced and that documentation is needed so they can support systems, mostly because every error occurrence in the enterprise will initially be blamed on the database and fall to the DBA to troubleshoot. Insisting too loudly may bring those cries of ‘foul’ that somewhat remind you of when your 2-year-old daughter didn't want to go to bed. The result of this petulance is that the next "enemy" gets involved. Managers: The concerns that motivate DBAs to argue will not excite the kind of manager who gets his technical knowledge from a glossy magazine filled with buzzwords, charts, and pretty pictures. However, the other programmers in the organization will constantly tickle the buzzword void with a stream of new-sounding ideas and technologies, along with warnings that if we did care about data integrity and documented things, the budget would explode! In contrast, the arguments for integrity of data and supportability tend to be about as exciting as watching grass grow, and far too many manager types seem to prefer to smoke it rather than watch it. Packaged Applications: The DBA is rarely given a chance to review a new application that is being demonstrated for the enterprise, and rarer still is the DBA who gets to veto an application because the database it uses has clearly been created by an architect who won't read a data modeling book because he is already married. More often than not this leads to hours of work for the DBA trying to performance-tune a database with a menagerie of rules that must be followed to stay within the application support agreement, such as not changing indexes on a third-party schema even though there are now 10 billion rows instead of the 10 thousand there were when the system was last optimized. Hardware Failures: Physical disks, networking devices, memory, and backup devices all come with a measure known as ‘mean time before failure’, and it is never listed in centuries or eons. More like years, and the term ‘mean’ indicates that half of the devices are expected to fail before that point, which by my calendar means it can fail at any hour of any day it wants to. But the DBA sucks it up and does the task at hand with a humility that makes them nearly invisible to all but the most observant person in the organization. The best DBAs I know are so proactive in their relentless pursuit of perfection that they detect many of the bugs (which they seldom caused) in the system well before they become a problem.
    In the end the DBA gets noticed for one of the same two reasons as the offensive lineman: they make a mistake, like dropping a critical production database that had never been backed up; or a system crashes for any reason whatsoever and they are on the spot with troubleshooting and system restoration plans that have been well thought out, tested, and tested again. Not because there is any glory in it, but because it is what they do. Note: The characteristics of the professions referred to in this blog are meant to be overstated stereotypes for humorous effect, and even some DBAs aren't quite this perfect. If you are reading this far and haven’t hand-written a 10-page flaming comment about how you are a _______ and you aren’t like this, that is awesome. Not every situation applies to everyone, but if you have never worked with a bad packaged app, a magazine-trained manager, programmers who aren’t team players, or hardware that occasionally failed, relax and go have a unicorn sandwich before you wake up.

    Read the article

  • SQL 2014 does data the way developers want

    - by Rob Farley
    This is a post I’ve been meaning to write for a while, and it’s good that it fits with this month’s T-SQL Tuesday, hosted by Joey D’Antoni (@jdanton). Ever since I got into databases, I’ve been a fan. I studied Pure Maths at university (as well as Computer Science), and am very comfortable with Set Theory, which undergirds relational database concepts. But I’ve also spent a long time as a developer, and appreciate that databases don’t exactly fit within the stuff I learned in my first year of uni, particularly the “Algorithms and Data Structures” subject, in which we studied concepts like linked lists. Writing in languages like C, we used pointers to quickly move around data, without a database in sight. Of course, if we had a power failure all this data was lost, as it was only persisted in RAM. Perhaps it’s why I’m a fan of database internals, of indexes, latches, execution plans, and so on – the developer in me wants to be reassured that we’re getting to the data as efficiently as possible. Back when SQL Server 2005 was approaching, one of the big stories was around CLR. Many were saying that T-SQL stored procedures would be a thing of the past because we now had CLR, and that was obviously going to be much faster than using the abstracted T-SQL. Around the same time, we were seeing technologies like Linq-to-SQL produce poor T-SQL equivalents, and developers had had a gutful. They wanted to move away from T-SQL, having lost trust in it. I was never one of those developers, because I’d looked under the covers and knew that despite being abstracted, T-SQL was still a good way of getting to data. It worked for me, appealing to both my Set Theory side and my Developer side. CLR hasn’t exactly become the default option for stored procedures, although there are plenty of situations where it can be useful for getting faster performance. SQL Server 2014 is different though, through Hekaton – its In-Memory OLTP environment. When you create a table using Hekaton (that is, a memory-optimized one), the table you create is the kind of thing you’d’ve made as a developer. It creates code in C leveraging structs and pointers and arrays, which it compiles into fast code. When you insert data into it, it creates a new instance of a struct in memory, and adds it to an array. When the insert is committed, a small write is made to the transaction log to make sure it’s durable, but none of the locking and latching behaviour that typifies transactional systems is needed. Indexes are done using hashes and Bw-trees (which avoid locking through the use of pointers), and each update is handled as a delete-and-insert. This is data the way that developers do it when they’re coding for performance – the way I was taught at university before I learned about databases. Being done in C, it compiles to very quick code, and although these tables don’t support every feature that regular SQL tables do, this is still an excellent direction that has been taken. @rob_farley
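
    To make the contrast concrete, here is a rough sketch of the "first-year data structures" shape described above: rows as structs in an array, a hash index from key to slot, and updates handled as delete-and-insert. This is purely illustrative of the idea, not a representation of SQL Server's actual In-Memory OLTP internals or its generated C code:

    using System.Collections.Generic;

    struct OrderRow
    {
        public int OrderId;
        public decimal Amount;
        public bool Deleted;          // delete marker instead of in-place removal
    }

    class InMemoryOrders
    {
        private readonly List<OrderRow> _rows = new List<OrderRow>();
        private readonly Dictionary<int, int> _hashIndex = new Dictionary<int, int>(); // OrderId -> slot

        public void Insert(OrderRow row)
        {
            _rows.Add(row);
            _hashIndex[row.OrderId] = _rows.Count - 1;
        }

        public void Update(OrderRow newVersion)
        {
            // "Update" = mark the old version deleted, then append the new version.
            if (_hashIndex.TryGetValue(newVersion.OrderId, out int slot))
            {
                var old = _rows[slot];
                old.Deleted = true;
                _rows[slot] = old;
            }
            Insert(newVersion);
        }

        public OrderRow? Lookup(int orderId) =>
            _hashIndex.TryGetValue(orderId, out int slot) && !_rows[slot].Deleted
                ? _rows[slot] : (OrderRow?)null;
    }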

    Read the article

  • 'ia32-libs is not installed' while installing Skype on Ubuntu

    - by Vit Kos
    I downloaded Skype from the official site, but when installing I get this kind of error: (Reading database ... 100% (Reading database ... 150271 files and directories currently installed.) Unpacking skype (from .../skype-ubuntu_4.0.0.8-1_amd64.deb) ... dpkg: dependency problems prevent configuration of skype: skype depends on ia32-libs; however: Package ia32-libs is not installed. dpkg: error processing skype (--install): dependency problems - leaving unconfigured Processing triggers for desktop-file-utils ... Processing triggers for bamfdaemon ... Rebuilding /usr/share/applications/bamf.index... Processing triggers for gnome-menus ... I read that I need to install ia32-libs and tried to install it like this: sudo apt-get install package-name:i386 But apt can't find the package. Any hint? Thanks.

    Read the article

  • removing an ssrs instance from a scale-out deployment

    - by Alex Bransky
    If you're like me you had at one time connected one of your Reporting Services instances to a report server database that was already in use by another instance.  This allows the instance to show up in the Scale-out Deployment section of the Reporting Services Configuration Manager.  My problem was that the server that got joined to the original server was no longer available as it had been repurposed, and when I clicked Remove Server to remove it from my scale-out it would fail because it couldn't contact the server.  After searching for a solution for quite some time I decided to look around in the report server database tables, and voila!  All I had to do was remove the old server from the Keys table.  I can't guarantee there won't be any side effects to this method, but it worked like a charm for me.
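
    For reference, the fix described above amounts to deleting the decommissioned instance's row from the Keys table in the report server database. A hedged sketch of that one-off cleanup; the column name (MachineName) and database name are assumptions, so inspect dbo.Keys in your own ReportServer database and take a backup before deleting anything:

    using System.Data.SqlClient;

    // Hedged sketch: remove the decommissioned instance's row from the Keys table.
    class RemoveScaleOutEntry
    {
        static void Main()
        {
            const string connectionString =
                "Server=.;Database=ReportServer;Integrated Security=true";

            using (var conn = new SqlConnection(connectionString))
            using (var cmd = new SqlCommand(
                "DELETE FROM dbo.Keys WHERE MachineName = @machine", conn))
            {
                cmd.Parameters.AddWithValue("@machine", "OLD-REPORT-SERVER"); // decommissioned host
                conn.Open();
                int rows = cmd.ExecuteNonQuery();
                System.Console.WriteLine($"{rows} row(s) removed from dbo.Keys.");
            }
        }
    }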

    Read the article

  • Access Control Service v2: Registering Web Identities in your Applications [concepts]

    - by Your DisplayName here!
    ACS v2 supports two fundamental types of client identities – I like to call them “enterprise identities” (WS-*) and “web identities” (Google, LiveID, OpenID in general…). I also see two different “mind sets” when it comes to application design using the above identity types: Enterprise identities – often the fact that a client can present a token from a trusted identity provider means he is a legitimate user of the application. Trust relationships and authorization details have been negotiated out of band (often on paper). Web identities – the fact that a user can authenticate with Google et al does not necessarily mean he is a legitimate (or registered) user of an application. Typically additional steps are necessary (like filling out a form, email confirmation etc). Sometimes a mixture of both approaches exists; for the sake of this post, I will focus on the web identity case. I got a number of questions about how to implement the web identity scenario, and after some conversations it turns out it is the old authentication vs. authorization problem that gets in the way. Many people use the IsAuthenticated property on IIdentity to make security decisions in their applications (or deny user=”?” in ASP.NET terms). That’s a very natural thing to do, because authentication was done inside the application and we knew exactly when the IsAuthenticated condition was true. Been there, done that. Guilty ;) The fundamental difference between these “old style” apps and federation is that authentication is not done by the application anymore. It is done by a third-party service, and in the case of web identity providers, by services that are not under our control (nor do we have a formal business relationship with these providers). Now the issue is, when you switch to ACS and someone with a Google account authenticates, indeed IsAuthenticated is true – because that’s what he is! This does not mean that he is also authorized to use the application. It just proves he was able to authenticate with Google. Now this obviously leads to confusion. How can we solve that? Easy answer: we have to deal with authentication and authorization separately. Job done ;) For many application types I see this general approach: the application uses ACS for authentication (maybe both enterprise and web identities; we focus on web identities, but you could easily have a dual approach here); the application offers to authenticate (or sign in) via web identity accounts like LiveID, Google, Facebook etc.; and the application also maintains a database of its “own” users, typically because you want to store additional information about each user. In such an application type it is important to have a unique identifier for your users (think the primary key of your user database). What would that be? Most web identity providers (and all the standard ACS v2 supported ones) emit a NameIdentifier claim. This is a stable ID for the client (scoped to the relying party – more on that later). Furthermore, ACS emits a claim identifying the identity provider (like the original issuer concept in WIF). When you combine these two values, you can be sure to have a unique identifier for the user, e.g.: Facebook-134952459903700\799880347 On incoming calls you can now check whether the user is already registered and, if so, swap the ACS claims for claims coming from your user database. One claim might be a role like “Registered User”, which can then easily be used for authorization checks in the application.
The WIF claims authentication manager is a perfect place to do the claims transformation. If the user is not registered, show a registration form. Maybe you can use some claims from the identity provider to pre-fill form fields (see here where I show how to use the Facebook API to fetch additional user properties). After successful registration (which may include other mechanisms like a confirmation email), flip the bit in your database to make the web identity a registered user. This is all very theoretical. In the next post I will show some code and provide a download link for the complete sample. More on NameIdentifier: identity providers “guarantee” that the name identifier for a given user in your application will always be the same. But different applications (in the case of ACS – different ACS namespaces) will see different name identifiers. This is by design to protect the privacy of users, because identical name identifiers could be used to create “profiles” of some sort for that user. In technical terms they create the name identifier approximately like this: name identifier = Hash((Provider Internal User ID) + (Relying Party Address)) Why is this important to know? Well – when you change the name of your ACS namespace, the name identifiers will change as well and you will lose your “connection” to your existing users. Oh, and by the way – never use any other claims (like email address or name) to form a unique ID – these can often be changed by users.
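
    In the meantime, here is a hedged sketch of what that claims transformation step can look like, written against the .NET ClaimsAuthenticationManager shape rather than the author's forthcoming sample. The identityprovider claim type URI, the IUserStore interface, and the user-key format are illustrative assumptions:

    using System.Security.Claims;

    // Sketch of swapping ACS claims for application claims at the trust boundary.
    public class RegistrationClaimsAuthenticationManager : ClaimsAuthenticationManager
    {
        // Assumed ACS identity-provider claim type; verify against the tokens you actually receive.
        private const string IdentityProviderClaimType =
            "http://schemas.microsoft.com/accesscontrolservice/2010/07/claims/identityprovider";

        private readonly IUserStore _users;   // your own user database access, e.g. injected

        public RegistrationClaimsAuthenticationManager(IUserStore users) => _users = users;

        public override ClaimsPrincipal Authenticate(string resourceName, ClaimsPrincipal incomingPrincipal)
        {
            if (incomingPrincipal.Identity?.IsAuthenticated != true)
                return incomingPrincipal;

            // Unique key = identity provider + name identifier, as described above.
            var provider = incomingPrincipal.FindFirst(IdentityProviderClaimType)?.Value;
            var nameId   = incomingPrincipal.FindFirst(ClaimTypes.NameIdentifier)?.Value;
            var userKey  = $"{provider}\\{nameId}";

            var user = _users.FindByKey(userKey);
            if (user == null)
                return incomingPrincipal;   // not registered yet: the app shows the registration form

            // Swap provider claims for the application's own claims.
            var identity = new ClaimsIdentity("Federated");
            identity.AddClaim(new Claim(ClaimTypes.Name, user.DisplayName));
            identity.AddClaim(new Claim(ClaimTypes.Role, "Registered User"));
            identity.AddClaim(new Claim(ClaimTypes.NameIdentifier, userKey));
            return new ClaimsPrincipal(identity);
        }
    }

    public interface IUserStore { ApplicationUser FindByKey(string userKey); }
    public class ApplicationUser { public string DisplayName { get; set; } }

    Such a manager is typically registered in the application's identity configuration so that it runs for every incoming token, which is what makes it a convenient single place to swap provider claims for application claims.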

    Read the article

  • Get Your Workshop Hands On!

    - by Justin Kestelyn
    Now that 2010 is behind us, a fresh set of Developer Day workshops (still free, always free) is ahead of us! Developer Day workshops are free, hands-on workshops that give you the software and skills to tame that learning curve and reach the next level in your technical knowledge. We have a range of entrees on the menu, including Java Development, Database Application Development, Fusion Development (Oracle ADF), and more. Most of these workshops let you walk away with a fully functional, VirtualBox-based software appliance that you can use for continued learning. Here's a short list of workshops for which you can register right now: Java: Boston, March 8; Database App Development: Dallas, March 9; SOA Development: Reston, March 9; Data Integration: Seattle, March 15; plus others planned for Toronto, Philadelphia, Shanghai, Perth, Istanbul, and many other cities in 2011! See this URL for more workshop info as it becomes available.

    Read the article

  • Doubt regarding search engine/plugin(One present on the website itself)

    - by Ravi Gupta
    I am new to web development and am studying various types of websites as case studies. Right now my focus is on how search engines work for an eCommerce website. I know the basic functioning of a search engine: crawl web pages, index them, and then display results using those indexes. But I got a little confused in the case of an eCommerce website. Wouldn't it be better if, instead of crawling the web pages containing products, the search engine directly crawled the database and indexed the products stored there? Then, when a user searches for a product, it would simply return the rows of the table that match the query. If this is not how it works, can someone please explain how the usual method works on an eCommerce website?
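
    For what it's worth, an eCommerce site's own search box usually does work the way the question suggests: it queries the product database directly, or a search index built straight from those rows, rather than crawling its own rendered pages (crawling is what external engines such as Google do, because they only see the HTML). A rough sketch of the "index the rows directly" idea, with made-up product fields; a real site would more likely use the database's full-text search or a dedicated engine such as Lucene, Solr, or Elasticsearch:

    using System;
    using System.Collections.Generic;
    using System.Linq;

    // Builds an inverted index from product rows loaded out of the database.
    class ProductSearchIndex
    {
        private readonly Dictionary<string, HashSet<int>> _index =
            new Dictionary<string, HashSet<int>>(StringComparer.OrdinalIgnoreCase);

        public void Add(int productId, string name, string description)
        {
            foreach (var term in Tokenize(name + " " + description))
            {
                if (!_index.TryGetValue(term, out var ids))
                    _index[term] = ids = new HashSet<int>();
                ids.Add(productId);
            }
        }

        // Returns product IDs matching every term; the rows themselves are fetched afterwards.
        public IEnumerable<int> Search(string query)
        {
            var sets = Tokenize(query)
                .Select(t => _index.TryGetValue(t, out var ids)
                    ? new HashSet<int>(ids)          // copy so the stored index is not mutated
                    : new HashSet<int>())
                .ToList();
            if (sets.Count == 0) return Enumerable.Empty<int>();
            return sets.Aggregate((a, b) => { a.IntersectWith(b); return a; });
        }

        private static IEnumerable<string> Tokenize(string text) =>
            text.Split(new[] { ' ', ',', '.', '-' }, StringSplitOptions.RemoveEmptyEntries);
    }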

    Read the article

  • ORACLE UNIVERSITY

    - by mseika
    Expert Seminar in Dubai: Oracle Database Security Audit with Pete Finnigan Oracle University's Expert Seminars are delivered by the best Oracle Gurus in the industry from all over the world. These unique and informative seminars are designed to provide you with expert insight in your area of interest. Pete Finnigan is delivering the Expert Seminar ‘Oracle Database Security Audit’ on 16-17 January 2013 in Dubai. You can find more information here. Please note: Your OPN discount is applied to the standard price shown on the website. For assistance with bookings contact Oracle University: eMail: [email protected] Telephone: +971 4 39 09 050

    Read the article

  • eSTEP Newsletter December 2012

    - by uwes
    Dear Partners, We would like to inform you that the December issue of our Newsletter is now available. The issue contains information on the following topics:
    Notes from Corporate: It's Earth Day - Every Day; Oracle SPARC Newsletter; Pre-Built Developer VMs (for Oracle VM VirtualBox); Oracle Database Appliance Now Certified by SAP; Database High Availability; Cultivating Business-Led Innovation.
    Technical Corner: Geek Fest! Talking About the Design of the T4 and T5 SPARC Chips; Blog: Is This Your Idea of Disaster Recovery?; Oracle Practitioner Guide: A Pragmatic Approach to Cloud Adoption; Darren Moffat Explains the New ZFS Encryption Features in Solaris 11.1; Command Summary: Basic Operations with the Image Packaging System; SPARC T4 Server Delivers Outstanding Performance on Oracle Business Intelligence Enterprise Edition 11g; SPARC T4-4 Servers Set First World Record on PeopleSoft HCM 9.1 Benchmark; Sun ZFS Appliance Monitor Refresh: Core Factor Table; Remanufactured Systems Program for Sun Systems from Oracle; Reminder: Oracle Premier Support for Systems; Reminder: Oracle Platinum Services.
    Learning & Events: eSTEP Events Schedule; Recently Delivered Techcasts; Webinar: Maximum Availability with Oracle GoldenGate.
    References: LUKOIL Overseas Holding Optimizes Oil Field Development Projects with Integrated Project Management; United Networks Increases Accounting Flexibility and Boosts System Performance with ERP Applications Upgrade; Ziggo Rapidly Creates Applications That Accelerate Communications-Service Orders.
    How to...: The Role of Oracle Solaris Zones and Oracle Linux Containers in a Virtualization Strategy; How to Update to Oracle Solaris 11.1; Using svcbundle to Create Manifests and Profiles in Oracle Solaris 11.1; How to Migrate Your Data to Oracle Solaris 11 Using Shadow Migration; How to Script Oracle Solaris 11.1 Zones for Easy Cloning; How to Script Oracle Solaris 11 Zones Creation for a Network-in-a-Box Configuration; How to Know Whether T4 Crypto Accelerators Are in Use; Fault Handling and Prevention - Part 1; Transforming and Consolidating Web Data with Oracle Database; Looking Under the Hood at Networking in Oracle VM Server for x86; Best Way to Migrate Data from Legacy File System to ZFS in Oracle Solaris 11.
    Special Year End Article: The Top 10 Strategic CIO Issues For 2013.
    You can find the Newsletter on our portal under eSTEP News ---> Latest Newsletter. You will need to provide your email address and the PIN below to get access. The link to the portal is shown below. URL: http://launch.oracle.com/ PIN: eSTEP_2011. Previously published Newsletters can be found under the Archived Newsletters section, and more useful information under the Events, Download and Links tabs. Feel free to explore, and any feedback is appreciated to help us improve the service and information we deliver. Thanks and best regards, Partner HW Enablement EMEA

    Read the article

  • Utility Objects Series Introduction (but mostly a bit of an update)

    - by drsql
    So, I have been away from blogging about technical stuff for a long time (I haven’t blogged at all since my resolutions blog, and even my Simple Talk “commentary” blog hasn’t had an entry since December!). Most of this has been due to finishing up my database design book, which I will blog about at least one more time after it ships next month, but now it is time to get back to it, certainly a bit more regularly. For SQL Rally, I have two sessions, a precon on Database Design,...(read more)

    Read the article

  • IncidentsTracker v1.2 Screenshots

    - by samkea
    The IncidentsTracker v1.2 system was developed to track incidents happening in any particular country. It incorporates a mapping component to enable end users to search for places where an incident has happened, enter data about it, and then produce reports. It is a WinForms application developed in a plugin style using C# with an extensibility pattern/framework. It sits on a SQL Server backend but can also use any other preferred database. The administrator just has to add the path where the database will be, and it will auto-create the database. This software was originally developed to help UN agencies and NGOs in their work, but it can also be utilised by other entities like the police, human rights organisations, roads authorities, etc. Development of a newer version (IncidentsTracker v2) has been started in Silverlight. Screenshot 01: Login. Screenshot 02: View and Search. Screenshot 03: Mapping Component

    Read the article

  • Oracle Fusion Middleware 11g Release 1 Updates (2014/08/14)

    - by Hiro
    The Oracle Fusion Middleware 11g Release 1 Media Pack was updated on 2014/08/14. The update covers: 1. Oracle Identity Management - Oracle API Gateway (Microsoft Windows 32-bit) and Oracle Identity Manager Connectors (Linux x86, Linux x86-64, Microsoft Windows 32-bit, Microsoft Windows x64). 2. Oracle WebLogic Server on Oracle Database Appliance - version 2.9 (Linux x86, Linux x86-64, Microsoft Windows 32-bit, Microsoft Windows x64). 3. Other - Oracle Fusion Middleware Repository Creation Utility 11g (11.1.1.7.0) and Oracle WebCenter Interaction 10.3.3: for the Repository Creation Utility, use version 11.1.1.8.0 instead; Oracle WebCenter Interaction is available through My Oracle Support.

    Read the article

  • How do I install pgAdmin III for postgreSQL 9.2?

    - by Vector
    I have a Windows server that runs PostgreSQL 9.2. I want to connect to it using pgAdmin III from my Ubuntu 12.10 workstation. I installed pgAdmin III from Synaptic and also tried a direct download from the PostgreSQL site using the software installer. Either way, I can only get pgAdmin III for PostgreSQL 9.1. When I run pgAdmin III and point it at my server, I get an error message telling me that the database is 9.2, my pgAdmin III is for 9.1, and the two aren't compatible. I can access the server itself fine from the Ubuntu box - I have Python programs that hit the database with no problems - but I need pgAdmin III for 9.2 running under Ubuntu 12.10. Is it available? Where do I get it?

    Read the article

  • More useful SQL Server Service Broker Queries

    - by ChrisD
    SELECT 'Checking Broker Service Status...'
    IF (select Top 1 is_broker_enabled from sys.databases where name = 'NWMESSAGE') = 1
        SELECT ' Broker Service IS Enabled'  -- Should return a 1.
    ELSE
        SELECT '** Broker Service IS DISABLED ***'
    /* If is_broker_enabled returns 0, uncomment and run this code
    ALTER DATABASE NWMESSAGE SET SINGLE_USER WITH ROLLBACK IMMEDIATE
    GO
    ALTER DATABASE NWMESSAGE SET ENABLE_BROKER
    GO
    ALTER DATABASE NWDataChannel SET MULTI_USER
    GO
    */

    SELECT 'Checking For Disabled Queues....'
    -- Ensure the queues are enabled: 0 indicates the queue is disabled.
    Select '** Receive Queue Disabled: ' + name from sys.service_queues where is_receive_enabled = 0
    --select [name], is_receive_enabled from sys.service_queues;
    /* If the queue is disabled, to enable it:
       alter queue QUEUENAME with status = on;  -- replace QUEUENAME with the name of your queue
    */

    -- Get general information about the queues
    --select * from sys.service_queues

    -- Get the message counts in each queue
    SELECT 'Checking Message Count for each Queue...'
    select q.name, p.rows
    from sys.objects as o
    join sys.partitions as p on p.object_id = o.object_id
    join sys.objects as q on o.parent_object_id = q.object_id
    join sys.service_queues sq on sq.name = q.name
    where p.index_id = 1

    -- Ensure all the queue activation sprocs are present
    SELECT 'Checking for Activation Stored Procedures....'
    SELECT '** Missing Procedure:  ' + q.name
    From sys.service_queues q
    Where NOT Exists(Select * from sysobjects where xtype = 'p' and name = 'activation_' + q.name)
      and q.activation_procedure is not null

    DECLARE @sprocs Table (Name Varchar(2000))
    Insert into @sprocs Values ('Echo')
    Insert into @sprocs Values ('HTTP_POST')
    Insert into @sprocs Values ('InitializeRecipients')
    Insert into @sprocs Values ('sp_EnableRecipient')
    Insert into @sprocs Values ('sp_ProcessReceivedMessage')
    Insert into @sprocs Values ('sp_SendXmlMessage')
    SELECT 'Checking for required stored procedures...'
    SELECT '** Missing Procedure:  ' + s.name
    From @sprocs s
    Where NOT Exists(Select * from sysobjects where xtype = 'p' and name = s.name)
    GO

    -- Check the services
    Select 'Checking Recipient Message Services...'
    Select '** Missing Message Service:' + r.RecipientName + 'MessageService'
    From Recipient r
    Where not exists (Select * from sys.services s
                      where s.name COLLATE SQL_Latin1_General_CP1_CI_AS = r.RecipientName + 'MessageService')

    DECLARE @svcs Table (Name Varchar(2000))
    Insert into @svcs Values ('XmlMessageSendingService')
    SELECT '** Missing Service:  ' + s.name
    From @svcs s
    Where NOT Exists(Select * from sys.services where name = s.name COLLATE SQL_Latin1_General_CP1_CI_AS)
    GO

    /*** To test a message send, run:
    sp_SendXmlMessage 'TSQLTEST', 'CommerceEngine', '<Root><Text>Test</Text></Root>'
    */
    Select CAST(message_body as XML) as xml, * From XmlMessageSendingQueue

    /*** Clean out all queues
    declare @handle uniqueidentifier
    declare conv cursor for
      select conversation_handle from sys.conversation_endpoints
    open conv
    fetch next from conv into @handle
    while @@FETCH_STATUS = 0
    Begin
       END Conversation @handle with cleanup
       fetch next from conv into @handle
    End
    close conv
    deallocate conv
    ***********************

    Read the article

  • MySQL in ASP.NET: Mono using VB.NET

    In a previous tutorial titled ASP.NET Web Development and Hosting, published October 25th, you learned how to develop ASP.NET websites using the Mono Project and deploy them to your existing Linux-Apache hosting account. The example ASP.NET Mono website, http://www.dotnetdevelopment.net, did not use a database at the time the tutorial was written. In this part you will learn how to connect to and use a MySQL database with your ASP.NET Mono project website....
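
    For context, the connection itself typically goes through the MySQL Connector/Net provider (MySql.Data), which also runs under Mono. A minimal hedged sketch in C# (the tutorial itself uses VB.NET); the connection string values, table, and column names are placeholders:

    using MySql.Data.MySqlClient;   // MySQL Connector/Net, works under Mono

    // Minimal sketch: open a connection and read a few rows.
    class MySqlExample
    {
        static void Main()
        {
            var connectionString =
                "Server=localhost;Database=mysite;Uid=webuser;Pwd=secret;";

            using (var conn = new MySqlConnection(connectionString))
            using (var cmd = new MySqlCommand("SELECT id, title FROM posts", conn))
            {
                conn.Open();
                using (var reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                        System.Console.WriteLine($"{reader.GetInt32(0)}: {reader.GetString(1)}");
                }
            }
        }
    }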

    Read the article

  • Building my first ASP.NET WebForms application problem

    - by user1525474
    Hi, I have recently started to learn C#/ASP.NET WebForms, and after reading two books I thought I was ready to create my first web application. The problem is I could not have been more wrong. Although I am not quite a beginner as a programmer and have done some programming in Java (a Monopoly game), JavaScript (using jQuery), and PHP (creating templates for WordPress), I have never really created something that is database driven, and I can't seem to figure out where to start. I am very confident in my HTML/CSS/jQuery skills, so that is not the problem. My end goal, after becoming comfortable with ASP.NET WebForms, is to learn MVC, ADO.NET, and the Entity Framework, and start a career as a .NET developer. I would appreciate it if someone could point me to some tutorials that build ASP.NET WebForms applications, such as a blog, so I can see the steps involved in creating a database-driven ASP.NET WebForms application. I already have two projects in mind for ASP.NET: one is building a blog and the other a job board.

    Read the article
