Search Results

Search found 85825 results on 3433 pages for 'sql data services'.


  • T-SQL Tuesday #007 and T-SQL Tuesday Has a Logo

    - by Adam Machanic
    This month’s T-SQL Tuesday is hosted by Jorge Segarra, the “SQL Chicken.” The topic is rather open-ended: What is your favorite new(ish) SQL Server feature? Love the DACPAC? Can’t wait for PDW? Post about it and tell us why! In other T-SQL Tuesday news, we now have a logo. Those of you who are participating in the event, take notice; the rules have changed. Now that we have a logo, we’re simplifying the linkback and subject guidelines a bit. Henceforth you can title your post however you want. It...(read more)

    Read the article

  • Last chance for a day of free SQL Server training at SQL in the City 2012

    SQL Server developers and database administrators have one last chance for a full day of free training and networking at SQL in the City 2012. NEW! Deployment Manager Early Access Release: Deploy SQL Server changes and .NET applications fast, frequently, and without fuss, using Deployment Manager, the new tool from Red Gate. Try the Early Access Release to get a 20% discount on Version 1. Download the Early Access Release.

    Read the article

  • Creating a SQL Azure Database Should be Easier

    - by Ken Cox [MVP]
    Every time I try to create a database + tables + data for Windows Azure SQL I get errors. One of them is “‘Filegroup reference and partitioning scheme’ is not supported in this version of SQL Server.” It’s partly due to my poor memory (since I’ve succeeded before) and partly due to the failure of tools that should be helping me. For example, when I want to create a script from an existing database on my local workstation, I use SQL Server Management Studio (currently v 11.0.2100.60). I go to Tasks > Generate Scripts, which brings up the nice Generate and Publish Scripts wizard. When I click the Advanced button, under Script for Server Version, why don’t I see SQL Azure as an option by now? The tool should be sorting this out for me, right? Maybe this is available in SQL Server Data Tools? I haven’t got into that yet. Just merge the functionality with SSMS, please. Anyway, I pick an older version of SQL for the target and still need to tweak it for Azure. For example, I take out all the “[dbo].” stuff. Why is it put there by the wizard? I also have to get rid of “ON [PRIMARY]” to deal with the error I noted at the top. Yes, there’s information on what a table needs to look like in SQL Azure, but the tools should know this so I don’t have to mess with it.
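
    To illustrate the kind of tweaking involved, here is a minimal sketch (the table is hypothetical, not from the original post) of a generated script before and after editing for SQL Azure:

        -- Hypothetical table, roughly as the SSMS wizard generates it for an
        -- on-premises target (fails on SQL Azure with the filegroup error):
        CREATE TABLE [dbo].[Customer] (
            [CustomerId] INT NOT NULL,
            [Name] NVARCHAR(100) NOT NULL,
            CONSTRAINT [PK_Customer] PRIMARY KEY CLUSTERED ([CustomerId])
        ) ON [PRIMARY]
        GO

        -- Edited for SQL Azure: the ON [PRIMARY] filegroup reference is removed
        -- (the [dbo]. prefix is harmless, though the author strips it too).
        -- SQL Azure also requires every table to have a clustered index, which
        -- the PRIMARY KEY CLUSTERED constraint satisfies here.
        CREATE TABLE [Customer] (
            [CustomerId] INT NOT NULL,
            [Name] NVARCHAR(100) NOT NULL,
            CONSTRAINT [PK_Customer] PRIMARY KEY CLUSTERED ([CustomerId])
        )
        GO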

    Read the article

  • SQL Date Comparison

    - by Derek Dieter
    When comparing the datetime datatype in SQL Server, it is important to maintain consistency in order to guard against SQL interpreting a date differently than you intend. On at least one occasion I have seen someone specify a short format for a date, like (1/4/08), only to find that SQL interpreted the month as [...]
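
    A minimal sketch of the ambiguity (not from the original article): the same short literal parses differently depending on the session's DATEFORMAT setting, while the unseparated and ISO 8601 forms are safe regardless of settings:

        -- '1/4/08' is ambiguous: the result depends on the session's DATEFORMAT.
        SET DATEFORMAT dmy;
        SELECT CAST('1/4/08' AS DATETIME);  -- 2008-04-01 (April 1st)
        SET DATEFORMAT mdy;
        SELECT CAST('1/4/08' AS DATETIME);  -- 2008-01-04 (January 4th)

        -- These forms parse the same way in any session:
        SELECT CAST('20080401' AS DATETIME);
        SELECT CAST('2008-04-01T00:00:00' AS DATETIME);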

    Read the article

  • Simple way to create a SQL Server Job Using T-SQL

    Sometimes we have a T-SQL process that takes some time to run, or that we want to run during idle time on the server. We could create a SQL Agent job manually, but is there any simple way to create a scheduled job? The seven tools in the SQL DBA Bundle support your core SQL Server database administration tasks. Make backups a breeze! Enjoy trouble-free troubleshooting! Make the most of monitoring! Download a free trial now.
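
    As a minimal sketch of the kind of scripting the article covers, a job can be created entirely with the stored procedures in msdb (the job name, command, and schedule below are hypothetical examples):

        USE msdb;
        GO
        -- Create the job itself (all names below are hypothetical).
        EXEC dbo.sp_add_job @job_name = N'NightlyCleanup';
        -- Add a T-SQL step that runs the long process.
        EXEC dbo.sp_add_jobstep
            @job_name = N'NightlyCleanup',
            @step_name = N'Run cleanup',
            @subsystem = N'TSQL',
            @database_name = N'MyDatabase',
            @command = N'EXEC dbo.usp_LongRunningProcess;';
        -- Schedule it daily at 02:00, typically an idle period.
        EXEC dbo.sp_add_jobschedule
            @job_name = N'NightlyCleanup',
            @name = N'Daily at 2am',
            @freq_type = 4,              -- daily
            @freq_interval = 1,          -- every 1 day
            @active_start_time = 020000; -- HHMMSS
        -- Target the local server so the job actually runs.
        EXEC dbo.sp_add_jobserver @job_name = N'NightlyCleanup', @server_name = N'(LOCAL)';
        GO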

    Read the article

  • Stairway to SQL PowerShell Level 4: Objects in SQL PowerShell

    So far, we have learned about installation and setup of the PowerShell environment, and you should now have a foundation in SQL Server PowerShell. We are now ready to learn about objects in SQL PowerShell. Schedule Azure backups: Red Gate’s Cloud Services makes it simple to create and schedule backups of your SQL Azure databases to Azure blob storage or Amazon S3. Try it for free today.

    Read the article

  • T-SQL Tuesday #34: HELP!

    - by merrillaldrich
    I owe my career to the SQL Server community, specifically the Internet SQL Server community, so this month’s T-SQL Tuesday is especially poignant. I changed careers “cold” about eight years ago, and, while I had some educational background in computer science, I had relatively little real-world DBA experience. Someone gave me a shot in the form of an entry level job, for which I am grateful, but I also had to make the argument to him that I would figure out whatever I needed to do to be successful...(read more)

    Read the article

  • what's a good way to synchronize a sql server 2008 database from a 2005 database automatically?

    - by Keith Nicholas
    Ok, the scenario is... two servers, on completely different parts of the internet. The SQL 2008 database just needs to get data updates and schema changes; it doesn't need to send anything to the 2005 database. Basically, just suck down data and schema as efficiently as possible, automatically, as a scheduled task. The database is quite huge, but the changes per day are probably only around 20-30 megabytes of data. I can't run any of the built-in replication on the 2005 database. I've had a wee look at the Sync Framework; I think that might do what I want, but it seems a bit painful and requires a bit of work to get going. I'm wondering if there is tooling out there to make this easier? Or? Not quite sure what my options are.

    Read the article

  • Reporting SQL Vulnerability [migrated]

    - by Ciaran87Bel
    My first post here so I'll hopefully keep it simple. I have just finished building a CMS targeted at a certain industry and built a test site to see how everything works. Anyway, I wrote a program to check for SQL injection vulnerabilities, and the program followed a blog link to an external website. The program discovered that the external site had a massive vulnerability that left it open to practically anyone, who could then access every bit of data on their MySQL server and run queries etc. The thing is, this external site is the brand leader in their industry and does millions upon millions of sales per annum. I have tried contacting them to let them know, and even went as far as contacting the company that built their platform, but I was pretty much brushed off and haven't heard back from them. Their database would contain the details of hundreds of thousands of customers and all their data. I could easily make myself site admin etc. in a few seconds, but they won't listen to me even though I have offered to share the vulnerability with them and help in any way I can. Is there anything else I can do? It is one of the biggest security risks I have ever personally come across. Are there any other steps I should take to report this? Thanks

    Read the article

  • Reporting Services keeps erasing my dataset parameters

    - by Dustin Brooks
    I'm using a web service, and every time I change something on the dataset it erases all my parameters. The weird thing is, I can execute the web service call from the data tab and it prompts for all my parameters, but if I click to edit the data the list is empty, or if I try to preview the report it blows up because parameters are missing. Just wondering if anyone else has experienced this and if there is a way to prevent this behavior. Here is a copy of the dataset, not that I think it matters. This has to be the most annoying bug (if it's a bug) ever. I can't even execute the dataset from the designer without it erasing my parameter list. When you have about 10 parameters and you are making all kinds of changes to a new report, it becomes very tedious to be constantly re-typing the same list over and over. If anything, the studio should at least be able to pre-populate with the parameters the service is asking for. *sigh* Where's my stress ball...

        <Query>
            <Method Namespace="http://www.abc.com/" Name="TWRPerformanceSummary"/>
            <SoapAction>http://www.abc.com/TWRPerformanceSummary</SoapAction>
            <ElementPath IgnoreNamespaces="true">
                TWRPerformanceSummaryResponse/TWRPerformanceSummaryResult/diffgram/NewDataSet/table{StockPerc,RiskBudget,Custodian,ProductName,StartValue(decimal),EndValue(decimal),CostBasis(decimal)}
            </ElementPath>
        </Query>

    Read the article

  • SQL Server 2008 and .Net 4.0?

    - by JMarsch
    Does anyone know whether I can load .NET 4.0 assemblies from SQL Server 2008? In particular, we are looking at SQL Reporting with Custom Data Extensions. I have noticed that the SQL Server 2008 Business Intelligence Studio does not seem to support VS2010 at the moment. If I release my CDEs as .NET 4.0 assemblies, will I even be able to load them from within the SQL Server Reporting Server?

    Read the article

  • I'm looking for a reliable way to verify T-SQL stored procedures. Anybody got one?

    - by Cory Larson
    Hi all-- We're upgrading from SQL Server 2005 to 2008. Almost every database in the 2005 instance is set to 2000 compatibility mode, but we're jumping to 2008. Our testing is complete, but what we've learned is that we need to get faster at it. I've discovered some stored procedures that either SELECT data from missing tables or try to ORDER BY columns that don't exist. Wrapping the SQL to create the procedures in SET PARSEONLY ON and trapping errors in a try/catch only catches the invalid columns in the ORDER BYs; it does not find the error with the procedure selecting data from the missing table. SSMS 2008's intellisense, however, DOES find the issue, but I can still go ahead and successfully run the ALTER script for the procedure without it complaining. So, why can I even get away with creating a procedure that fails when it runs? Are there any tools out there that can do better than what I've tried? The first tool I found wasn't very useful: DbValidator from CodeProject. It finds fewer problems than this script I found on SqlServerCentral, which found the invalid column references.

        -------------------------------------------------------------------------
        -- Check Syntax of Database Objects
        -- Copyrighted work. Free to use as a tool to check your own code or in
        -- any software not sold. All other uses require written permission.
        -------------------------------------------------------------------------
        -- Turn on ParseOnly so that we don't actually execute anything.
        SET PARSEONLY ON
        GO

        -- Create a table to iterate through
        declare @ObjectList table (ID_NUM int NOT NULL IDENTITY (1, 1),
                                   OBJ_NAME varchar(255), OBJ_TYPE char(2))

        -- Get a list of most of the scriptable objects in the DB.
        insert into @ObjectList (OBJ_NAME, OBJ_TYPE)
        SELECT name, type
        FROM sysobjects
        WHERE type in ('P', 'FN', 'IF', 'TF', 'TR', 'V')
        order by type, name

        -- Var to hold the SQL that we will be syntax checking
        declare @SQLToCheckSyntaxFor varchar(max)
        -- Var to hold the name of the object we are currently checking
        declare @ObjectName varchar(255)
        -- Var to hold the type of the object we are currently checking
        declare @ObjectType char(2)
        -- Var to indicate our current location in iterating through the list of objects
        declare @IDNum int
        -- Var to indicate the max number of objects we need to iterate through
        declare @MaxIDNum int

        -- Set the initial value and max value
        select @IDNum = Min(ID_NUM), @MaxIDNum = Max(ID_NUM)
        from @ObjectList

        -- Begin iteration
        while @IDNum <= @MaxIDNum
        begin
            -- Load per iteration values here
            select @ObjectName = OBJ_NAME, @ObjectType = OBJ_TYPE
            from @ObjectList
            where ID_NUM = @IDNum

            -- Get the text of the db Object (ie create script for the sproc)
            SELECT @SQLToCheckSyntaxFor = OBJECT_DEFINITION(OBJECT_ID(@ObjectName, @ObjectType))

            begin try
                -- Run the create script (remember that PARSEONLY has been turned on)
                EXECUTE(@SQLToCheckSyntaxFor)
            end try
            begin catch
                -- See if the object name is the same in the script and the catalog
                -- (kind of a special error)
                if (ERROR_PROCEDURE() <> @ObjectName)
                begin
                    print 'Error in ' + @ObjectName
                    print '   The Name in the script is ' + ERROR_PROCEDURE() + '. (They don''t match)'
                end
                -- If the error is just that this already exists then we don't want to report that.
                else if (ERROR_MESSAGE() <> 'There is already an object named ''' + ERROR_PROCEDURE() + ''' in the database.')
                begin
                    -- Report the error that we got.
                    print 'Error in ' + ERROR_PROCEDURE()
                    print '   ERROR TEXT: ' + ERROR_MESSAGE()
                end
            end catch

            -- Setup to iterate to the next item in the table
            select @IDNum = case when Min(ID_NUM) is NULL then @IDNum + 1
                                 else Min(ID_NUM) end
            from @ObjectList
            where ID_NUM > @IDNum
        end

        -- Turn the ParseOnly back off.
        SET PARSEONLY OFF
        GO

    Any suggestions?

    Read the article

  • Big GRC: Turning Data into Actionable GRC Intelligence

    - by Jenna Danko
    While it’s no longer headline news that governments have carried out large-scale data-mining programmes aimed at terrorism detection and at identifying other patterns of interest across a wide range of digital data sources, the debate over the ethics and justification of this action will clearly continue for some time to come. What is becoming clear is that these programmes are a framework for the collation and aggregation of massive amounts of unstructured data and, from this, the creation of actionable intelligence from analyses that allowed the analysts to explore and extract a variety of patterns and then direct resources. This data included audio and video chats, phone calls, photographs, e-mails, documents, internet searches, social media posts and mobile phone logs and connections.

    Although Governance, Risk and Compliance (GRC) professionals are not looking at the implementation of such programmes, there are many similar GRC “big data” challenges to be faced, and potential lessons to be learned from these high-profile government programmes that can be applied a lot closer to home. For example, how can GRC professionals collect, manage and analyze an enormous and disparate volume of data to create and manage their own actionable intelligence covering hidden signs and patterns of criminal activity; the early or retrospective violation of regulations, laws or corporate policies and procedures; emerging risks; weakening controls; and so on? Not exactly the stuff of James Bond, to be sure, but it is certainly more applicable to most GRC professionals’ day-to-day challenges.

    So what is Big Data, and how can it benefit the GRC process? Although it often varies, the definition of Big Data largely refers to the following types of data:

      - Traditional Enterprise Data – includes customer information from CRM systems, transactional ERP data, web store transactions, and general ledger data.
      - Machine-Generated/Sensor Data – includes Call Detail Records (“CDR”), weblogs and trading systems data.
      - Social Data – includes customer feedback streams, micro-blogging sites like Twitter, and social media platforms like Facebook.

    The McKinsey Global Institute estimates that data volume is growing 40% per year, and will grow 44x between 2009 and 2020. But while it’s often the most visible parameter, volume of data is not the only characteristic that matters. In fact, according to sources such as Forrester, there are four key characteristics that define big data:

      - Volume. Machine-generated data is produced in much larger quantities than non-traditional data. This is all the data generated by IT systems that power the enterprise. This includes live data from packaged and custom applications – for example, app servers, Web servers, databases, networks, virtual machines, telecom equipment, and much more.
      - Velocity. Social media data streams – while not as massive as machine-generated data – produce a large influx of opinions and relationships valuable to customer relationship management, as well as offering early insight into potential reputational risk issues. Even at 140 characters per tweet, the high velocity (or frequency) of Twitter data ensures large volumes (over 8 TB per day) need to be managed.
      - Variety. Traditional data formats tend to be relatively well defined by a data schema and change slowly. In contrast, non-traditional data formats exhibit a dizzying rate of change. Without question, all GRC professionals work in a dynamic environment, and as new services, new products or new business lines are added, or new marketing campaigns executed, new data types are needed to capture the resultant information.
      - Value. The economic value of data varies significantly. Typically, there is good information hidden amongst a larger body of non-traditional data that GRC professionals can use to add real value to the organisation; the greater challenge is identifying what is valuable and then transforming and extracting that data for analysis and action.

    For example, customer service calls and emails have millions of useful data points and have long been a source of information to GRC professionals. Those calls and emails are critical in helping GRC professionals better identify hidden patterns and implement new policies that can reduce the amount of customer complaints. Now, on a scale and depth far beyond those in place today, all that unstructured call and email data can be captured, stored and analyzed to reveal the reasons for the contact, perhaps with the aggregated customer results cross-referenced against what is being said about the organization, or a similar peer organization, on social media. The organization can then take positive actions: communicating to the market in advance of issues reaching the press, strengthening controls, adjusting risk profiles, changing policy and procedures, and completely minimizing, if not eliminating, complaints and compensation for that specific reason in the future. In this one example of many similar ones, the GRC team(s) has demonstrated real and tangible business value.

    Big Challenges - Big Opportunities

    As pointed out by recent Forrester research, high-performing companies (those that are growing 15% or more year-on-year compared to their peers) are taking a selective approach to investing in Big Data: "Tomorrow's winners understand this, and they are making selective investments aimed at specific opportunities with tangible benefits where big data offers a more economical solution to meet a need." (Forrsights Strategy Spotlight: Business Intelligence and Big Data, Q4 2012) As pointed out earlier, with the ever-increasing volume of regulatory demands and fines for getting it wrong, limited resource availability, and out-of-date or inadequate GRC systems all contributing to a higher cost of compliance and/or a higher risk profile than desired, a big data investment in GRC clearly falls into this category.

    However, to make the most of big data, organizations must evolve both their business and IT procedures, processes, people and infrastructures to handle these new high-volume, high-velocity, high-variety sources of data and be able to integrate them with the pre-existing company data to be analyzed. GRC big data clearly gives the organization access to, and management over, a huge amount of often very sensitive information that, although it can help create a more risk-intelligent organization, also presents numerous data governance challenges, including regulatory compliance and information security. In addition to client and regulatory demands for better information security and data protection, the sheer amount of information organizations deal with means that the need to quickly access, classify, protect and manage that information can quickly become a key issue from a legal as well as a technical or operational standpoint. However, by making information governance processes a bigger part of everyday operations, organizations can make sure data remains readily available and protected.

    The Right GRC & Big Data Partnership Becomes Key

    The "getting it right first time" mantra used in so many companies remains essential for any GRC team that is sponsoring, helping kick-start, or even overseeing a big data project. To make a big data GRC initiative work and get the desired value, partnerships with companies who have a long history of success in delivering GRC solutions, as well as being at the very forefront of technology innovation, become key. Clearly, solutions can be built in-house more cheaply than through a vendor, but as has been proven time and time again with self-built solutions covering AML and fraud, for example, few have been able to scale or adapt appropriately to meet the changing regulations or challenges that GRC teams face on a daily basis. This has led to the creation of the GRC silos that are causing so many headaches today. The solutions that stand out and should be explored are the ones that can seamlessly merge the traditional world of well-known data, analytics and visualization with the new world of seemingly innumerable data sources, utilizing Big Data technologies to generate new GRC insights right across the enterprise. Ultimately, Big Data is here to stay, and organizations that embrace its potential and outline a viable strategy, as well as understand and build a solid analytical foundation, will be the ones that are well positioned to make the most of it.

    A Blueprint and Roadmap Service for Big Data

    Big data adoption is first and foremost a business decision. As such, it is essential that your partner can align your strategies, goals, and objectives with an architecture vision and roadmap to accelerate adoption of big data for your environment, as well as establish practical, effective governance that will maintain a well-managed environment going forward.

    Key Activities: While your initiatives will clearly vary, there are some generic starting points the team and organization will need to complete:

      - Clearly define your drivers, strategies, goals, objectives and requirements as they relate to big data
      - Conduct a big data readiness and Information Architecture maturity assessment
      - Develop a future-state big data architecture, including views across all relevant architecture domains: business, applications, information, and technology
      - Provide initial guidance on big data candidate selection for migrations or implementation
      - Develop a strategic roadmap and implementation plan that reflects a prioritization of initiatives based on business impact and technology dependency, and an incremental integration approach for evolving your current state to the target future state in a manner that represents the least amount of risk and impact of change on the business
      - Provide recommendations for practical, effective Data Governance, Data Quality Management, and Information Lifecycle Management to maintain a well-managed environment
      - Conduct an executive workshop with recommendations and next steps

    There is little debate that managing risk and data are the two biggest obstacles encountered by financial institutions. Big data is here to stay, and risk management certainly is not going anywhere; ultimately, the financial services industry organizations that embrace big data's potential and outline a viable strategy, as well as understand and build a solid analytical foundation, will be best positioned to make the most of it.

    Matthew Long is a Financial Crime Specialist for Oracle Financial Services. He can be reached at matthew.long AT oracle.com.

    Read the article

  • Introducing Data Annotations Extensions

    - by srkirkland
    Validation of user input is integral to building a modern web application, and ASP.NET MVC offers us a way to enforce business rules on both the client and server using Model Validation. The recent release of ASP.NET MVC 3 has improved these offerings on the client side by introducing an unobtrusive validation library built on top of jquery.validation. Out of the box, MVC comes with support for Data Annotations (that is, System.ComponentModel.DataAnnotations) and can be extended to support other frameworks. Data Annotations Validation is becoming more popular and is being baked in to many other Microsoft offerings, including Entity Framework, though with MVC it only contains four validators: Range, Required, StringLength and Regular Expression. The Data Annotations Extensions project attempts to augment these validators with additional attributes while maintaining the clean integration Data Annotations provides.

    A Quick Word About Data Annotations Extensions

    The Data Annotations Extensions project can be found at http://dataannotationsextensions.org/, and currently provides 11 additional validation attributes (ex: Email, EqualTo, Min/Max) on top of Data Annotations’ original 4. You can find a current list of the validation attributes on the aforementioned website. The core library provides server-side validation attributes that can be used in any .NET 4.0 project (no MVC dependency). There is also an easily pluggable client-side validation library which can be used in ASP.NET MVC 3 projects using unobtrusive jquery validation (only MVC3-included javascript files are required).

    On to the Preview

    Let’s say you had the following “Customer” domain model (or view model, depending on your project structure) in an MVC 3 project:

        public class Customer
        {
            public string Email { get; set; }
            public int Age { get; set; }
            public string ProfilePictureLocation { get; set; }
        }

    When it comes time to create/edit this Customer, you will probably have a CustomerController and a simple form that just uses one of the Html.EditorFor() methods that the ASP.NET MVC tooling generates for you (or you can write yourself). With no validation, the customer can enter nonsense for an email address, and can even report their age as a negative number! With the built-in Data Annotations validation, I could do a bit better by adding a Range to the age, adding a RegularExpression for email (yuck!), and adding some required attributes. However, I’d still be able to report my age as 10.75 years old, and my profile picture could still be any string. Let’s use Data Annotations along with this project, Data Annotations Extensions, and see what we can get:

        public class Customer
        {
            [Email]
            [Required]
            public string Email { get; set; }

            [Integer]
            [Min(1, ErrorMessage="Unless you are benjamin button you are lying.")]
            [Required]
            public int Age { get; set; }

            [FileExtensions("png|jpg|jpeg|gif")]
            public string ProfilePictureLocation { get; set; }
        }

    Now let’s try to put in some invalid values and see what happens: that is very nice validation, all done on the client side (it will also be validated on the server). Also, the Customer class validation attributes are very easy to read and understand. Another bonus: since Data Annotations Extensions can integrate with MVC 3’s unobtrusive validation, no additional scripts are required! Now that we’ve seen our target, let’s take a look at how to get there within a new MVC 3 project.

    Adding Data Annotations Extensions To Your Project

    First we will File->New Project and create an ASP.NET MVC 3 project. I am going to use Razor for these examples, but any view engine can be used in practice. Now go into the NuGet Extension Manager (right click on references and select Add Library Package Reference) and search for “DataAnnotationsExtensions.” You should see two packages: the first is for server-side validation scenarios, but since we are using MVC 3 and would like comprehensive server and client validation support, click on the DataAnnotationsExtensions.MVC3 project and then click Install. This will install the Data Annotations Extensions server and client validation DLLs along with David Ebbo’s WebActivator (which enables the validation attributes to be registered with MVC 3).

    Now that Data Annotations Extensions is installed you have all you need to start doing advanced model validation. If you are already using Data Annotations in your project, just making use of the additional validation attributes will provide client and server validation automatically. However, assuming you are starting with a blank project, I’ll walk you through setting up a controller and model to test with.

    Creating Your Model

    In the Models folder, create a new User.cs file with a User class that you can use as a model. To start with, I’ll use the following class:

        public class User
        {
            public string Email { get; set; }
            public string Password { get; set; }
            public string PasswordConfirm { get; set; }
            public string HomePage { get; set; }
            public int Age { get; set; }
        }

    Next, create a simple controller with at least a Create method, and then a matching Create view (note, you can do all of this via the MVC built-in tooling). Your files will look something like this:

    UserController.cs:

        public class UserController : Controller
        {
            public ActionResult Create()
            {
                return View(new User());
            }

            [HttpPost]
            public ActionResult Create(User user)
            {
                if (!ModelState.IsValid)
                {
                    return View(user);
                }

                return Content("User valid!");
            }
        }

    Create.cshtml:

        @model NuGetValidationTester.Models.User

        @{
            ViewBag.Title = "Create";
        }

        <h2>Create</h2>

        <script src="@Url.Content("~/Scripts/jquery.validate.min.js")" type="text/javascript"></script>
        <script src="@Url.Content("~/Scripts/jquery.validate.unobtrusive.min.js")" type="text/javascript"></script>

        @using (Html.BeginForm()) {
            @Html.ValidationSummary(true)
            <fieldset>
                <legend>User</legend>
                @Html.EditorForModel()
                <p>
                    <input type="submit" value="Create" />
                </p>
            </fieldset>
        }

    In the Create.cshtml view, note that we are referencing jquery validation and jquery unobtrusive (jquery is referenced in the layout page). These MVC 3-included scripts are the only ones you need to enjoy both the basic Data Annotations validation as well as the validation additions available in Data Annotations Extensions. These references are added by default when you use the MVC 3 “Add View” dialog on a modification template type.

    Now when we go to /User/Create we should see a form for editing a User. Since we haven’t yet added any validation attributes, this form is valid as shown (including no password, no email and an age of 0). With the built-in Data Annotations attributes we can make some of the fields required, and we could use a range validator of maybe 1 to 110 on Age (of course we don’t want to leave out supercentenarians), but let’s go further and validate our input comprehensively using Data Annotations Extensions. The new and improved User.cs model class:

        public class User
        {
            [Required]
            [Email]
            public string Email { get; set; }

            [Required]
            public string Password { get; set; }

            [Required]
            [EqualTo("Password")]
            public string PasswordConfirm { get; set; }

            [Url]
            public string HomePage { get; set; }

            [Integer]
            [Min(1)]
            public int Age { get; set; }
        }

    Now let’s re-run our form and try to use some invalid values: all of the validation errors occur on the client, without ever even hitting submit. The validation is also checked on the server, which is a good practice since client validation is easily bypassed. That’s all you need to do to start a new project and include Data Annotations Extensions, and of course you can integrate it into an existing project just as easily.

    Nitpickers Corner

    ASP.NET MVC 3 Futures defines four new data annotations attributes which this project has as well: CreditCard, Email, Url and EqualTo. Unfortunately, referencing MVC 3 Futures necessitates taking a dependency on MVC 3 in your model layer, which may be unadvisable in a multi-tiered project. Data Annotations Extensions keeps the server- and client-side libraries separate, so using the project’s validation attributes doesn’t require you to take any additional dependencies in your model layer while still allowing for the rich client validation experience if you are using MVC 3.

    Custom Error Message and Globalization: since the Data Annotations Extensions are built on top of Data Annotations, you have the ability to define your own static error messages and even to use resource files for very customizable error messages.

    Available Validators: please see the project site at http://dataannotationsextensions.org/ for an up-to-date list of the new validators included in this project. As of this post, the following validators are available: CreditCard, Date, Digits, Email, EqualTo, FileExtensions, Integer, Max, Min, Numeric, Url.

    Conclusion

    Hopefully I’ve illustrated how easy it is to add server and client validation to your MVC 3 projects, and how easily you can extend the available validation options to meet real-world needs. The Data Annotations Extensions project is fully open source under the BSD license. Any feedback would be greatly appreciated. More information than you require, along with links to the source code, is available at http://dataannotationsextensions.org/. Enjoy!

    Read the article

  • #DAX Query Plan in SQL Server 2012 #Tabular

    - by Marco Russo (SQLBI)
    The SQL Server Profiler provides a lot of information regarding the internal behavior of DAX queries sent to a BISM Tabular model. As with MDX, in DAX there is a Formula Engine (FE) and a Storage Engine (SE). The SE is usually handled by Vertipaq (unless you are using DirectQuery mode), and the Vertipaq SE Query class of events gives you a SQL-like syntax that represents the query sent to the storage engine. Another interesting class of events is the DAX Query Plan, which contains a couple...(read more)

    Read the article

  • Replication with SQL Server 2005 Express Edition and SQL Compact Edition 3.5

    - by Andy Gable
    Hi all, I need some information on SQL Server 2005 Express Edition. What I want to do is have my central database serving local machine databases, i.e. a back office:

        Central database
         |------------------- Shop Floor Terminal 1
         |------------------- Shop Floor Terminal 2
         |------------------- Shop Floor Terminal 3
         |------------------- Shop Floor Terminal 4
         |------------------- Shop Floor Terminal 5
         |------------------- Shop Floor Terminal 6

    I want it so that the shop floor terminals would PULL down any changes to the database as and when they happen (only selected changes are needed; typical changes would be Add New Item, or Edit Item Info that is used by the shop floor terminal, i.e. price, description, sale group). Is this possible with SQL 2005? I have the ability to make my own sync application, but I would need to know what to look for in the database that triggers an update. Many thanks for any advice you can give. Andy
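
    One common way to detect changes for a hand-rolled sync application (a hypothetical sketch, not from the original question) is a rowversion column, which SQL Server bumps automatically on every insert or update:

        -- On the central database: rowversion changes on every insert/update,
        -- so it can drive incremental pulls.
        ALTER TABLE dbo.Item ADD RowVer rowversion;

        -- On each shop floor terminal: remember the highest version already
        -- copied (the SyncState table and the CENTRAL linked server name are
        -- placeholders), then pull only rows changed since then.
        DECLARE @LastSync binary(8);
        SELECT @LastSync = LastRowVer FROM dbo.SyncState;

        SELECT ItemId, Price, Description, SaleGroup, RowVer
        FROM CENTRAL.ShopDB.dbo.Item
        WHERE RowVer > @LastSync;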

    Read the article

  • SQL Server 2005 jobs running twice in a row - using LiteSpeed

    - by Malnizzle
    Howdy! I have a SQL server (2005) backing up to a network share, with a group of maintenance plans set up through LiteSpeed to back up different DBs. They were just set up to run two sub-plans on different schedules for full/diff backups, and did that just fine for a couple of months. Then I added a "Clean Up" task to the sub-plans. Ever since that point, the backup creates another bak right after the first bak job is completed. I removed the clean-up item from the sub-plan, and it still creates two baks when run. Both the SQL Activity Monitor and the machine's windows application log show just one job being executed. I did this same thing to a couple of other servers backing up to the same location, and they are behaving correctly. Thoughts?

    Read the article

  • SQL Server 2008 Express - "Best" backup solution?

    - by Alexander Nyquist
    Hi! What backup solutions would you recommend when using SQL Server 2008 Express? I'm pretty new to SQL Server, but as I'm coming from a MySQL background I thought of setting up replication on another computer and just taking xcopy backups of that server. But unfortunately replication is not available in the Express edition. The site is heavily accessed, so there have to be no delays or downtime. I'm also thinking of doing a backup twice a day or something. What would you recommend? I have multiple computers I can use, but I don't know if that helps me, since I'm using the Express version. Thanks
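
    Since Express has no SQL Agent, one minimal sketch (the database name and paths are placeholders) is a plain BACKUP DATABASE script run from Windows Task Scheduler via sqlcmd:

        -- backup.sql: full backup to a date-stamped file. Schedule with e.g.:
        --   sqlcmd -S .\SQLEXPRESS -i backup.sql
        DECLARE @file nvarchar(260) =
            N'D:\Backups\MySite_' + CONVERT(nvarchar(8), GETDATE(), 112) + N'.bak';
        BACKUP DATABASE MySite
            TO DISK = @file
            WITH INIT, CHECKSUM;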

    Read the article

  • Migrated SQL Server database suddenly in "Restoring" state

    - by Pete Montgomery
    Edit: This is still a live problem, less than an hour after trying RESTORE ... WITH RECOVERY. I backed up a SQL Server 2005 database and restored it to a new SQL 2008 instance. The restore was quick and successful. Everything was fine for an hour or so. Suddenly, the database is now stuck in "(Restoring...)" state in Management Studio with a green arrow icon, and my application login is failing! Any advice? :-) Edit: This is a live application. If I delete the database and try again, the hour or so's data will be lost.
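
    For reference, the RESTORE ... WITH RECOVERY command the poster mentions trying is presumably of the form (the database name is a placeholder):

        RESTORE DATABASE MyDb WITH RECOVERY;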

    Read the article

  • SQL CLR not properly enabling

    - by dnolan
    We have a SQL server running SQL 2005 Workgroup 64-bit (9.0.4273) on Windows 2003 Server 64-bit. We have run sp_configure and reconfigured the server, which indicates that the CLR is now enabled:

        exec sp_configure 'clr enabled', '1'
        go
        reconfigure
        go

    However, when trying to call CREATE ASSEMBLY the server completely dies on us and we have to do a full reboot of the machine. A little more diagnostic information: even though clr enabled is set to 1 and we have rebooted the full server, running select * from sys.dm_clr_properties returns

        name         value
        ---------    --------------------------------
        directory
        version
        state        Locked CLR version with mscoree

    which is what it says when the CLR is not enabled on another machine. On a correctly enabled machine (after reboot) this returns

        name         value
        ---------    --------------------------------
        directory    C:\Windows\Microsoft.NET\Framework64\v2.0.50727\
        version      v2.0.50727
        state        CLR is initialized

    Read the article

  • MSSQL 2005 migration to 2008 Express Edition - Any complications?

    - by FullTrust
    Hey, I've developed an application that uses ASP.NET, Linq-to-SQL and MSSQL 2005. However, I would like to migrate it to MSSQL 2008. I don't have MSSQL 2008, so I was wondering if it's possible for me to detach my 2005 db and attach it within 2008 express edition, to test if it will work on my host's MSSQL 2008 server? I haven't done anything complicated (CRUD is done from Linq to SQL, and all stored procs are the ASP.NET Membership default ones). Would this work, or will I get an error since I'm 'downgrading' so to speak? If I download MSSQL 2008 express edition, it will be on the same system as my MSSQL 2005 Developer Edition. I'm hoping this won't cause any problems? Thanks
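
    For reference, a minimal sketch of the detach/attach round trip (the database name and file paths are placeholders). One caveat worth noting: attaching upgrades the database's internal version, so once it has been attached to 2008 the files can no longer be re-attached to the 2005 instance; work on copies of the mdf/ldf files.

        -- On the 2005 instance:
        EXEC sp_detach_db @dbname = N'MyAppDb';

        -- On the 2008 Express instance, after copying the files:
        CREATE DATABASE MyAppDb
            ON (FILENAME = N'C:\Data\MyAppDb.mdf'),
               (FILENAME = N'C:\Data\MyAppDb_log.ldf')
            FOR ATTACH;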

    Read the article

  • SQL Server Windows Auth Login not working

    - by Mr Shoubs
    I've had someone set up a domain controller on Windows 2008 on one server, and SQL Server 2008 on another. The domain seems to be working fine; I'm logged on as a domain user on both servers, and nothing seems to be a problem there. However, when I try to add a domain user/group to SQL Server Security (e.g. clicking OK from the create login screen), it says it can't find it (even though I used the search to find the correct account in the first place). When I try to log on (even though I haven't added the login yet), it says something about the account being part of an untrusted domain, instead of saying I don't have permission to log on. Anyone have any ideas on what is set up incorrectly?

    Read the article

  • Simple SQL Server 2005 Replication - "D-1" server used for heavy queries/reports

    - by Ricardo Pardini
    Hello. We have two SQL 2005 machines. One is used for production data, and the other is used for running queries/reports. Every night, the production machine dumps (backs up) its database to disk, and the other one restores it. This is called the D-1 process. I think there must be a more efficient way of doing this, since SQL 2005 has many forms of replication. Some requirements:

        1) No need for instant replication; there can be (some) delay
        2) All changes (including schemas, data, constraints, indexes) need to be replicated without manual intervention
        3) It is used for a single database only
        4) There is a third server available if needed
        5) There is high bandwidth (gigabit ethernet) available between the servers
        6) There isn't shared storage (SAN) available

    What would be a good alternative to this daily backup/restore routine? Thanks!

    Read the article
