Search Results

Search found 67075 results on 2683 pages for 'data model'.

Page 6/2683 | < Previous Page | 2 3 4 5 6 7 8 9 10 11 12 13  | Next Page >

  • CIC 2010 - Ghost Stories and Model Based Design

    - by warren.baird
    I was lucky enough to attend the collaboration and interoperability congress recently. The location was very beautiful and interesting, it was held in the mountains about two hours outside Denver, at the Stanley hotel, famous both for inspiring Steven King's novel "The Shining" and for attracting a lot of attention from the "Ghost Hunters" TV show. My visit was prosaic - I didn't get to experience the ghosts the locals promised - but interesting, with some very informative sessions. I noticed one main theme - a lot of people were talking about Model Based Design (MBD), which is moving design and manufacturing away from 2d drawings and towards 3d models. 2d has some pretty deep roots in industrial manufacturing and there have been a lot of challenges encountered in making the leap to 3d. One of the challenges discussed in several sessions was how to get model information out to the non-engineers in the company, which is a topic near and dear to my heart. In the 2D space, people without access to CAD software (for example, people assembling a product on the shop floor) can be given printouts of the design - it's not particularly efficient, and it definitely isn't very green, but it tends to work. There's no direct equivalent in the 3D space. One of the ways that AutoVue is used in industrial manufacturing is to provide non-CAD users with an easy to use, interactive 3D view of their products - in some cases it's directly used by people on the shop floor, but in cases where paper is really ingrained in the process, AutoVue can be used by a technical publications person to create illustrative 2D views that can be printed that show all of the details necessary to complete the work. Are you making the move to model based design? Is AutoVue helping you with your challenges? Let us know in the comments below.

    Read the article

  • Sample domain model for online store

    - by Carel
    We are a group of 4 software development students currently studying at the Cape Peninsula University of Technology. Currently, we are tasked with developing a web application that functions as a online store. We decided to do the back-end in Java while making use of Google Guice for persistence(which is mostly irrelevant for my question). The general idea so far to use PHP to create the website. We decided that we would like to try, after handing in the project, and register a business to actually implement the website. The problem we have been experiencing is with the domain model. These are mostly small issues, however they are starting to impact the schedule of our project. Since we are all young IT students, we have virtually no experience in the business world. As such, we spend quite a significant amount of time planning the domain model in the first place. Now, some of the issues we're picking up is say the reference between the Customer entity and the order entity. Currently, we don't have the customer id in the order entity and we have a list of order entities in the customer entity. Lately, I have wondered if the persistence mechanism will put the client id physically in the order table, even if it's not in the entity? So, I started wondering, if you load a customer object, it will search the entire order table for orders with the customer's id. Now, say you have 10 000 customers and 500 000 orders, won't this take an extremely long time? There are also some business processes that I'm not completely clear on. Finally, my question is: does anyone know of a sample domain model out there that is similar to what we're trying to achieve that will be safe to look at as a reference? I don't want to be accused of stealing anybody's intellectual property, especially since we might implement this as a business.

    Read the article

  • General rule - when to use a model (Codeigniter)

    - by pingu
    Hi guys, I was just curious as to what the rule of thumb was for models. Generally, I use them only for situations where I need to add/edit or update database entries for an object. However, I'm building an app at the moment that has a "config" table which holds various data, such as last updated, which will control when certain features in the app should be displayed. In this instance, I will mostly need to retrieve data from the config table. Is it worth putting these config methods in model? I'm interested to hear how more experienced coders approach the MVC methodology in CI - example pseudo methods (e.g., what methods relating to the same object you'd use in the model and the controller) would be most helpful.

    Read the article

  • Cakephp change model's title without rename model/dbtable

    - by Vincent
    I have a table that store all information about class, but because Class is reserved for class name I had to rename the table from classes to types. But in the view section I need it to be displayed as "Class", including the Paginator links. Anyway to achieve this by adding something in the Type model, without completely customize Paginator and and all view compoenents?

    Read the article

  • Modifying a HTML page to fix several "bugs" add a function to next/previous on a option dropdown

    - by Dennis Sylvian
    SOF, I've got a few problems plaguing me at the moment and am wondering if anyone could assist me with them. I'm trying to get Next Class | Previous Class to act as buttons so that when Next Class is clicked it will go to the next item in the dropdown list and for previous it would go to back one. There used to be a scroll bar that allowed me to scroll the main window left and right, it's missing because (I think it was to do with the scroll left and scroll right function) The footer at the bottom doesn't show correctly on mobile devices; for some reason it appears completely differently to as it does on a computer. The "bar" practically and the Scroll Left and Scroll buttons don't appear at all on mobile devices. The scroll left button is unable to be clicked for some reason, I'm unsure what I've done wrong. Refreshing the page resets the horizontal scroll position to far left (I'm pretty sure this relates to the scroll bar) I want to also find a way so that on mobile devices the the header will not show the placeholder image, however I can't work out what CSS media tag(s) I should be using. Latest: http://jsfiddle.net/pwv7u/ Smaller HTML <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd"> <html xmlns="http://www.w3.org/1999/xhtml"> <head> <meta http-equiv="Content-Type" content="text/html; charset=utf-8" /> <title>DATA DATA DATA DATA DATA DATA DATA DATA</title> <style type="text/css"> <!-- @import url("nstyle.css"); --> </style> <script src="jquery.min.js" type="text/javascript"></script> <script type="text/javascript"> $(document).ready( function() { for (var i=0;i<($("table").children().length);i++){ if(readCookie(i)) $($($("table").children()[i]).children()[(readCookie(i))]).toggleClass('selected').siblings().removeClass('selected'); } $("tr").click(function(){ $(this).toggleClass('selected').siblings().removeClass('selected'); if(readCookie($(this).parent().index())){ if(readCookie($(this).parent().index())==$(this).index()) eraseCookie($(this).parent().index()); else{ eraseCookie($(this).parent().index()); createCookie($(this).parent().index(),$(this).index(),1); } } else createCookie($(this).parent().index(),$(this).index(),1); }); // gather CLASS info var selector = $('.class-selector').on('change', function(){ var id = this.value; if (id!==''){ scrollToAnchor(id); } }); $('a[id^="CLASS"]').each(function(){ var id = this.id, option = $('<option>',{ value: this.id, text:this.id }); selector.append(option); }); function scrollToAnchor(aid) { var aTag = $("a[id='" + aid + "']"); $('html,body').animate({ scrollTop: aTag.offset().top - 80 }, 1); } $("a.TOPJS").click(function () { scrollToAnchor('TOP'); }); $("a.KEYJS").click(function () { scrollToAnchor('KEY'); }); $("a.def").click(function () { $('#container').animate({ "scrollLeft": "-=204" }, 200); }); $("a.abc").click(function () { $("#container").animate({ "scrollLeft": "+=204" }, 200); }); function createCookie(name,value,days) { var expires; if (days) { var date = new Date(); date.setMilliseconds(0); date.setSeconds(0); date.setMinutes(0); date.setHours(0); date.setDate(date.getDate()+days); expires = "; expires="+date.toGMTString(); } else expires = ""; document.cookie = name+"="+value+expires+"; path=/"; } function readCookie(name) { var nameEQ = name + "="; var ca = document.cookie.split(';'); for(var i=0;i < ca.length;i++) { var c = ca[i]; while (c.charAt(0)==' ') c = c.substring(1,c.length); if (c.indexOf(nameEQ) === 0) return 
c.substring(nameEQ.length,c.length); } return null; } function eraseCookie(name) { createCookie(name,"",-1); } }); </script> </head> <body> <div id="header_container"> <div id="header"> <a href="http://site.x/" target="_blank"><img src="http://placehold.it/300x80"></a> <select class="class-selector"> <option value="">-select class-</option> </select> <div class="classcycler"> <a href="#TOP"><font color=#EFEFEF>Next Class</font></a> <font color=red>|</font> <a href="#TOP"><font color=#EFEFEF>Previous Class</font></a> </div> <div id="header1"> Semi-Transparent Image <a href="#TOP"><font color=#EFEFEF>Up to Top</font></a> | <a href="#KEY"><font color=#EFEFEF>Down to Key</font></a> </div> </div> </div> <a id="TOP"></a> <div id="container"> <table id="gradient-style"> <tbody> <thead> <tr> <th scope="col"><a id="CLASS1"></a>Class 1</th> <th scope="col">Class 1</th> <th scope="col">Class 1</th> <th scope="col">Class<br>Test 1</th> <th scope="col">Class 1</th> <th scope="col">Class 1</th> <th scope="col">Class 1</th> <th scope="col">Class Data 1</th> <th scope="col">Class 1<br>Class 1</th> <th scope="col">Class 1</th> <th scope="col">Class 1<br>Class 1</th> <th scope="col">Class 1</th> <th scope="col">Class 1</th> <th scope="col">Class 1</th> <th scope="col">Class 1</th> <th scope="col">Class 1</th> <th scope="col">Class 1 Class 1</th> <th scope="col">title text<br> data text</th> <th scope="col">title text<br> data text</th> <th scope="col">title text</th> <th scope="col">title text<br> data text</th> <th scope="col">title text<br> data text</th> <th scope="col">title text<br> data text</th> <th scope="col">title text</th> <th scope="col">title text<br> data text</th> <th scope="col">title text<br> data text</th> <th scope="col">title text<br> data text</th> <th scope="col">title text<br> data text</th> <th scope="col">title text<br> (data text)</th> <th scope="col">title text</th> <th scope="col">text</th> <th scope="col">text</th> <th scope="col">title text</th> <th scope="col">title text</th> </tr> </thead> <tr class="ft3"><td>testing data</td><td>testing data</td><td>test</td><td>class b</td><td>test4</td><td><div align="left">data</div></td><td><div align="left"> </div></td><td><div align="left"></div></td><td>testing data</td><td>testing data</td><td>testing data</td><td>testing data</td><td>test</td><td>test</td><td>test</td><td>test</td><td>testing data</td><td>test</td><td>testing data</td><td>testing data</td><td>testing data</td><td>test</td><td>test</td><td>testing data</td><td>testing data</td><td>testing data</td><td>test</td><td>testing data</td><td>test</td><td>testing data</td><td>test</td><td>test</td><td>testing data</td><td>testing data</td><tr> <tr class="f3"><td>test</td><td>test</td><td>test</td><td>class a</td><td>test2</td><td><div align="left"> </div></td><td><div align="left"></div></td><td><div align="left"></div></td><td>testing data</td><td>test</td><td>test</td><td>test</td><td>testing data</td><td>testing data</td><td>test</td><td>testing data</td><td>test</td><td>testing data</td><td>testing data</td><td>test</td><td>testing data</td><td>testing data</td><td>test</td><td>testing data</td><td>testing data</td><td>testing data</td><td>test</td><td>testing data</td><td>test</td><td>test</td><td>test</td><td>test</td><td>testing data</td><td>test</td><tr> <thead> <tr> <th scope="col"><a id="CLASS2"></a>Class 2</th> <th scope="col">Class 2</th> <th scope="col">Class 2</th> <th scope="col">Class<br>Test 2</th> <th scope="col">Class 2</th> <th scope="col">Class 2</th> 
<th scope="col">Class 2</th> <th scope="col">Class Data 2</th> <th scope="col">Class 2<br>Class 2</th> <th scope="col">Class 2</th> <th scope="col">Class 2<br>Class 2</th> <th scope="col">Class 2</th> <th scope="col">Class 2</th> <th scope="col">Class 2</th> <th scope="col">Class 2</th> <th scope="col">Class 2</th> <th scope="col">Class 2 Class 2</th> <th scope="col">title text<br> data text</th> <th scope="col">title text<br> data text</th> <th scope="col">title text</th> <th scope="col">title text<br> data text</th> <th scope="col">title text<br> data text</th> <th scope="col">title text<br> data text</th> <th scope="col">title text</th> <th scope="col">title text<br> data text</th> <th scope="col">title text<br> data text</th> <th scope="col">title text<br> data text</th> <th scope="col">title text<br> data text</th> <th scope="col">title text<br> (data text)</th> <th scope="col">title text</th> <th scope="col">text</th> <th scope="col">text</th> <th scope="col">title text</th> <th scope="col">title text</th> </tr> </thead> <tr class="ft3"><td>testing data</td><td>testing data</td><td>test</td><td>class f</td><td>test2</td><td><div align="left">data</div></td><td><div align="left"></div></td><td><div align="left">data</div></td><td>test</td><td>test</td><td>testing data</td><td>test</td><td>test</td><td>test</td><td>testing data</td><td>testing data</td><td>testing data</td><td>testing data</td><td>testing data</td><td>test</td><td>testing data</td><td>test</td><td>test</td><td>test</td><td>testing data</td><td>testing data</td><td>test</td><td>test</td><td>test</td><td>testing data</td><td>testing data</td><td>testing data</td><td>testing data</td><td>testing data</td><tr> <tr><td>test</td><td>testing data</td><td>test</td><td>class f</td><td>test4</td><td><div align="left">data</div></td><td><div align="left"></div></td><td><div align="left"></div></td><td>testing data</td><td>test</td><td>test</td><td>test</td><td>testing data</td><td>testing data</td><td>testing data</td><td>testing data</td><td>testing data</td><td>test</td><td>test</td><td>test</td><td>test</td><td>test</td><td>testing data</td><td>test</td><td>testing data</td><td>testing data</td><td>test</td><td>test</td><td>test</td><td>testing data</td><td>test</td><td>testing data</td><td>testing data</td><td>testing data</td><tr> <tr class="f3"><td>test</td><td>testing data</td><td>testing data</td><td>class d</td><td>test5</td><td><div align="left">data</div></td><td><div align="left"> </div></td><td><div align="left">data</div></td><td>test</td><td>test</td><td>test</td><td>test</td><td>test</td><td>testing data</td><td>testing data</td><td>testing data</td><td>testing data</td><td>testing data</td><td>testing data</td><td>testing data</td><td>testing data</td><td>testing data</td><td>testing data</td><td>test</td><td>test</td><td>testing data</td><td>testing data</td><td>testing data</td><td>testing data</td><td>test</td><td>test</td><td>testing data</td><td>testing data</td><td>testing data</td><tr> <tr><td>testing data</td><td>test</td><td>test</td><td>class f</td><td>test5</td><td><div align="left"></div></td><td><div align="left"></div></td><td><div align="left">data</div></td><td>testing data</td><td>test</td><td>testing data</td><td>testing data</td><td>test</td><td>test</td><td>testing data</td><td>test</td><td>test</td><td>testing data</td><td>testing data</td><td>test</td><td>test</td><td>testing data</td><td>test</td><td>test</td><td>test</td><td>test</td><td>testing data</td><td>testing data</td><td>testing 
data</td><td>test</td><td>test</td><td>testing data</td><td>test</td><td>testing data</td><tr> <tr class="f2"><td>test</td><td>test</td><td>testing data</td><td>class a</td><td>test1</td><td><div align="left">data</div></td><td><div align="left"> </div></td><td><div align="left">data</div></td><td>test</td><td>test</td><td>testing data</td><td>testing data</td><td>test</td><td>testing data</td><td>test</td><td>test</td><td>testing data</td><td>testing data</td><td>test</td><td>testing data</td><td>testing data</td><td>testing data</td><td>testing data</td><td>test</td><td>test</td><td>testing data</td><td>testing data</td><td>testing data</td><td>testing data</td><td>testing data</td><td>test</td><td>testing data</td><td>testing data</td><td>test</td><tr> </tbody> <tfoot> <tr> <th class="alt" colspan="34" scope="col"><a id="KEY"></a><img src="http://placehold.it/300x50"></th> </tr> <tr> <td colspan="34"><em><b>DATA DATA</b> - DATA DATA DATA DATA DATA DATA DATA DATA DATA DATA DATA DATA DATA DATA DATA DATA DATA DATA DATA DATA DATA DATA DATA DATA DATA DATA DATA DATA DATA DATA DATA DATA DATA DATA DATA DATA DATA DATA DATA DATA </em></td> </tr> <tr> <td class="alt" colspan="34"><em><b>DAT DATA</b> - DATA DATA DATA DATA DATA DATA DATA DATA DATA DATA DATA DATA DATA DATA DATA DATA DATA DATA DATA DATA </em></td> </tr> </tfoot> </table> </div> <div id="footer_container"> <div id="footer"> <a href="http://site.x/" target="_blank"><img src="http://placehold.it/300x80"></a> <div class="footleft"> <a class="def" href="javascript: void(0);"><font color="#EFEFEF">Scroll Left</font></a> </div> <div id="footer1"> <font color="darkblue">Semi-Transparent Image</font> <i>Copyright &copy; 2013 <a href="http://site.x/" target="_blank" style="text-decoration: none"><font color=#ADD8E6>site</font></a>.</i> </div> <div id="footer2"> <i>All Rights Reserved.</i> </div> <div class="footright"> <a class="abc" href="javascript: void(0);"><font color="#EFEFEF">Scroll Right</font></a> </div> </div> </div> </body> </html> CSS gradient-style * { white-space: nowrap; } #header .class-selector { top: 10px; left: 20px; position: fixed; } #header .classcycler { top: 45px; left: 20px; position: fixed; font-size:20px; } body { line-height: 1.6em; background-color: #535353; overflow-x: scroll; } #gradient-style { font-family: "Lucida Sans Unicode", "Lucida Grande", Sans-Serif; font-size: 12px; margin: 0px; width: 100%; text-align: center; border-collapse: collapse; } #gradient-style th { font-size: 13px; font-weight: normal; line-height:250%; padding-left: 5px; padding-right: 5px; background: #535353 url('table-images/gradhead.png') repeat-x; border-top: 1px solid #fff; border-bottom: 1px solid #fff; color: #ffffff; } #gradient-style th.alt { font-family: "Times New Roman", Serif; text-align: left; padding: 10px; font-size: 26px; } #gradient-style td { padding-left: 5px; padding-right: 5px; border-bottom: 1px solid #fff; border-left: 1px solid #fff; border-right: 1px solid #fff; color: #00000; border-top: 1px solid #fff; background: #FFF url('table-images/gradback.png') repeat-x; } #gradient-style tr.ft3 td { color: #00000; background: #99cde7 url('table-images/gradoverallstudent.png') repeat-x; font-weight: bold; } #gradient-style tr.f1 td { color: #00000; background: #99cde7 url('table-images/gradbeststudent.png') repeat-x; } #gradient-style tr.f2 td { color: #00000; background: #b7e2b6 url('table-images/gradmostattentedstudent.png') repeat-x; } #gradient-style tr.f3 td { color: #00000; background: #a9cd6c 
url('table-images/gradleastlatestudtent.png') repeat-x; } #gradient-style tfoot tr td { background: #6FA275; font-size: 12px; color: #000; padding: 10; text-align: left; } #gradient-style tbody tr:hover td, #gradient-style tbody tr.selected td { background: #d0dafd url('table-images/gradhover.png') repeat-x; color: #339; } body { margin: 0; padding: 0; } #header_container { background: #000000 url('table-images/gradhead.png') repeat-x; border: 0px solid #666; height: 80px; left: 0; position: fixed; width: 100%; top: 0; } #header { position: relative; margin: 0 auto; width: 500px; height: 100%; text-align: center; color: #0c0aad; } #header1 { position: absolute; width: 125%; top: 50px; } #container { margin: 0 auto; overflow: auto; padding: 80px 0; width: 100%; } #content { } #footer_container { background: #000000 url('table-images/gradhead.png') repeat-x; border: 0px solid #666; bottom: 0; height: 95px; left: 0; position: fixed; width: 100%; } #footer { position: relative; margin: 0 auto; height: 100%; text-align: center; color: #FFF; } #footer1 { position: absolute; width: 103%; top: 50px; } #footer2 { position: absolute; width: 110%; top: 70px; } #footer .footleft { top: 45px; left: 2%; position: absolute; font-size:20px; } #footer .footright { top: 45px; right: 2%; position: absolute; font-size:20px; }

    Read the article

  • Oracle Communications Data Model

    - by jean-pierre.dijcks
    I've mentioned OCDM in previous posts but found the following (see end of the post) podcast on the topic and figured it is worthwhile to spread the news some more. ORetailDM and OCommunicationsDM are the two data models currently available from Oracle. Both are intended to capture: Business best practices and industry knowledge Pre-built advanced analytics intended to predict future events before they happen (like the Churn model shown below) Oracle technology best practices to ensure optimal performance of the model All of this typically comes with a reduced time to implementation, or as the marketing slogan goes, reduced time to value. Here are the links: Podcast on OCDM OTN pages for OCDM and ORDM

    Read the article

  • Architecting multi-model multi-DB ASP.NET MVC solution

    - by A. Murray
    I have an ASP.NET MVC 4 solution that I'm putting together, leveraging IoC and the repository pattern using Entity Framework 5. I have a new requirement to be able to pull data from a second database (from another internal application) which I don't have control over. There is no API available unfortunately for the second application and the general pattern at my place of work is to go direct to the database. I want to maintain a consistent approach to modeling the domain and use entity framework to pull the data out, so thus far I have used Entity Framework's database first approach to generate a domain model and database context over the top of this. However, I've become a little stuck on how to include the second domain model in the application. I have a generic repository which I've now moved out to a common DataAccess project, but short of creating two distinct wrappers for the generic repository (so each can identify with a specific database context), I'm struggling to see how I can elegantly include multiple models?

    Read the article

  • Telecomunication SID model and resources [on hold]

    - by andygluk
    There is a SID model well-known in telecom industry. Following this model you define resources as resources owned by your enterprise, and then you build resource-oriented services on top of it and then customer-oriented services and so on... So everything is based on enterprise-owned resources, which you have to identify first. What I am looking for and what I am asking is some alternative to this model, build not on enterprise-owned resources, but on resources sell by enterprise. Say, you are selling licenses for using your products. So instead of building model on top of enterprise resources you may be interested to build it on top of licenses you are selling.

    Read the article

  • Data Mappers, Models and Images

    - by James
    Hi, I've seen and read plenty of blog posts and forum topics talking about and giving examples of Data Mapper / Model implementations in PHP, but I've not seen any that also deal with saving files/images. I'm currently working on a Zend Framework based project and I'm doing some image manipulation in the model (which is being passed a file path), and then I'm leaving it to the mapper to save that file to the appropriate location - is this common practise? But then, how do you deal with creating say 3 different size images from the one passed in? At the moment I have a "setImage($path_to_tmp_name)" which checks the image type, resizes and then saves back to the original filename. A call to "getImagePath()" then returns the current file path which the data mapper can use and then change with a call to "setImagePath($path)" once it's saved it to the appropriate location, say "/content/my_images". Does this sound practical to you? Also, how would you deal with getting the URL to that image? Do you see that as being something that the model should be providing? It seems to me like that model should worry about where the images are being stored or ultimately how they're accessed through a browser and so I'm inclined to put that in the ini file and just pass the URL prefix to the view through the controller. Does that sound reasonable? I'm using GD for image manipulation - not that that's of any relevance. UPDATE: I've been wondering if the image resizing should be done in the model at all. The model could require that it's provided a "main" image and a "thumb" image, both of certain dimensions. I've thought about creating a "getImageSpecs()" function in the model that would return something that defines the required sizes, then a separate image manipulation class could carry out the resizing and (perhaps in the controller?) and just pass the final paths in to the model using something like "setImagePaths($images)". Any thoughts much appreciated :) James.

    Read the article

  • Trying to Nullify Django model fields with method where model and fields are parameters

    - by Johnny4000
    I'm trying to write a method like the below where a list of fields (a subset of all the fields) is passed in as a parameter and has their column values set to null. I would be happy of I could get a method with just the fields as a parameter like below, but having the model as a parameter would be even better. from my_project.my_app.models import MyModel def nullify_columns (self, null_fields): field_names = MyModel._meta.get_all_field_names() for field in field_names: if field in null_fields: # The below line does not work because I'm not sure how to # dynamically assign the field name. MyModel.objects.all().update( (MyModel.get_field(field).column) = None) Right now I have something like if 'column1' in list_of_fields: MyModel.objects.all().update(column1 = None) if 'column2' in list_of_fields: MyModel.objects.all().update(column2 = None) etc. which is horrible, but works.

    Read the article

  • Shrinking TCP Window Size to 0 on Cisco ASA

    - by Brent
    Having an issue with any large file transfer that crosses our Cisco ASA unit come to an eventual pause. Setup Test1: Server A, FileZilla Client <- 1GBPS - Cisco ASA <- 1 GBPS - Server B, FileZilla Server TCP Window size on large transfers will drop to 0 after around 30 seconds of a large file transfer. RDP session then becomes unresponsive for a minute or two and then is sporadic. After a minute or two, the FTP transfer resumes, but at 1-2 MB/s. When the FTP transfer is over, the responsiveness of the RDP session returns to normal. Test2: Server C in same network as Server B, FileZilla Client <- local network - Server B, FileZilla Server File will transfer at 30+ MB/s. Details ASA: 5520 running 8.3(1) with ASDM 6.3(1) Windows: Server 2003 R2 SP2 with latest patches Server: VMs running on HP C3000 blade chasis FileZilla: 3.3.5.1, latest stable build Transfer: 20 GB SQL .BAK file Protocol: Active FTP over tcp/20, tcp/21 Switches: Cisco Small Business 2048 Gigabit running latest 2.0.0.8 VMware: 4.1 HP: Flex-10 3.15, latest version Notes All servers are VMs. Thoughts Pretty sure the ASA is at fault since a transfer between VMs on the same network will not show a shrinking Window size. Our ASA is pretty vanilla. No major changes made to any of the settings. It has a bunch of NAT and ACLs. Wireshark Sample No. Time Source Destination Protocol Info 234905 73.916986 1.1.1.1 2.2.2.2 TCP ftp-data ivecon-port [ACK] Seq=1 Ack=131981791 Win=65535 Len=0 234906 73.917220 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234907 73.917224 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234908 73.917231 1.1.1.1 2.2.2.2 TCP ftp-data ivecon-port [ACK] Seq=1 Ack=131984551 Win=64155 Len=0 234909 73.917463 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234910 73.917467 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234911 73.917469 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234912 73.917476 1.1.1.1 2.2.2.2 TCP ftp-data ivecon-port [ACK] Seq=1 Ack=131988691 Win=60015 Len=0 234913 73.917706 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234914 73.917710 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234915 73.917715 1.1.1.1 2.2.2.2 TCP ftp-data ivecon-port [ACK] Seq=1 Ack=131991451 Win=57255 Len=0 234916 73.917949 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234917 73.917953 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234918 73.917958 1.1.1.1 2.2.2.2 TCP ftp-data ivecon-port [ACK] Seq=1 Ack=131994211 Win=54495 Len=0 234919 73.918193 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234920 73.918197 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234921 73.918202 1.1.1.1 2.2.2.2 TCP ftp-data ivecon-port [ACK] Seq=1 Ack=131996971 Win=51735 Len=0 234922 73.918435 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234923 73.918440 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234924 73.918445 1.1.1.1 2.2.2.2 TCP ftp-data ivecon-port [ACK] Seq=1 Ack=131999731 Win=48975 Len=0 234925 73.918679 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234926 73.918684 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234927 73.918689 1.1.1.1 2.2.2.2 TCP ftp-data ivecon-port [ACK] Seq=1 Ack=132002491 Win=46215 Len=0 234928 73.918922 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234929 73.918927 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234930 73.918932 1.1.1.1 2.2.2.2 TCP ftp-data ivecon-port [ACK] Seq=1 Ack=132005251 Win=43455 Len=0 234931 73.919165 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234932 73.919169 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234933 73.919174 1.1.1.1 2.2.2.2 TCP ftp-data ivecon-port [ACK] Seq=1 Ack=132008011 
Win=40695 Len=0 234934 73.919408 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234935 73.919413 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234936 73.919418 1.1.1.1 2.2.2.2 TCP ftp-data ivecon-port [ACK] Seq=1 Ack=132010771 Win=37935 Len=0 234937 73.919652 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234938 73.919656 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234939 73.919661 1.1.1.1 2.2.2.2 TCP ftp-data ivecon-port [ACK] Seq=1 Ack=132013531 Win=35175 Len=0 234940 73.919895 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234941 73.919899 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234942 73.919904 1.1.1.1 2.2.2.2 TCP ftp-data ivecon-port [ACK] Seq=1 Ack=132016291 Win=32415 Len=0 234943 73.920138 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234944 73.920142 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234945 73.920147 1.1.1.1 2.2.2.2 TCP ftp-data ivecon-port [ACK] Seq=1 Ack=132019051 Win=29655 Len=0 234946 73.920381 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234947 73.920386 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234948 73.920391 1.1.1.1 2.2.2.2 TCP ftp-data ivecon-port [ACK] Seq=1 Ack=132021811 Win=26895 Len=0 234949 73.920625 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234950 73.920629 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234951 73.920632 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234952 73.920638 1.1.1.1 2.2.2.2 TCP ftp-data ivecon-port [ACK] Seq=1 Ack=132025951 Win=22755 Len=0 234953 73.920868 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234954 73.920871 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234955 73.920876 1.1.1.1 2.2.2.2 TCP ftp-data ivecon-port [ACK] Seq=1 Ack=132028711 Win=19995 Len=0 234956 73.921111 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234957 73.921115 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234958 73.921120 1.1.1.1 2.2.2.2 TCP ftp-data ivecon-port [ACK] Seq=1 Ack=132031471 Win=17235 Len=0 234959 73.921356 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234960 73.921362 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234961 73.921370 1.1.1.1 2.2.2.2 TCP ftp-data ivecon-port [ACK] Seq=1 Ack=132034231 Win=14475 Len=0 234962 73.921598 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234963 73.921606 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234964 73.921613 1.1.1.1 2.2.2.2 TCP ftp-data ivecon-port [ACK] Seq=1 Ack=132036991 Win=11715 Len=0 234965 73.921841 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234966 73.921848 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234967 73.921855 1.1.1.1 2.2.2.2 TCP ftp-data ivecon-port [ACK] Seq=1 Ack=132039751 Win=8955 Len=0 234968 73.922085 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234969 73.922092 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234970 73.922099 1.1.1.1 2.2.2.2 TCP ftp-data ivecon-port [ACK] Seq=1 Ack=132042511 Win=6195 Len=0 234971 73.922328 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234972 73.922335 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234973 73.922342 1.1.1.1 2.2.2.2 TCP ftp-data ivecon-port [ACK] Seq=1 Ack=132045271 Win=3435 Len=0 234974 73.922571 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234975 73.922579 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234976 73.922586 1.1.1.1 2.2.2.2 TCP ftp-data ivecon-port [ACK] Seq=1 Ack=132048031 Win=675 Len=0 234981 75.866453 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 675 bytes 234985 76.020168 1.1.1.1 2.2.2.2 TCP [TCP ZeroWindow] ftp-data ivecon-port [ACK] Seq=1 Ack=132048706 Win=0 Len=0 234989 76.771633 2.2.2.2 1.1.1.1 TCP [TCP ZeroWindowProbe] ivecon-port ftp-data [ACK] Seq=132048706 Ack=1 Win=65535 Len=1 234990 76.771648 
1.1.1.1 2.2.2.2 TCP [TCP ZeroWindowProbeAck] [TCP ZeroWindow] ftp-data ivecon-port [ACK] Seq=1 Ack=132048706 Win=0 Len=0 234997 78.279701 2.2.2.2 1.1.1.1 TCP [TCP ZeroWindowProbe] ivecon-port ftp-data [ACK] Seq=132048706 Ack=1 Win=65535 Len=1 234998 78.279714 1.1.1.1 2.2.2.2 TCP [TCP ZeroWindowProbeAck] [TCP ZeroWindow] ftp-data ivecon-port [ACK] Seq=1 Ack=132048706 Win=0 Len=0

    Read the article

  • Shrinking Windows Size to 0 on Cisco ASA

    - by Brent
    Having an issue with any large file transfer that crosses our Cisco ASA unit come to an eventual pause. Setup Test1: Server A, FileZilla Client <- 1GBPS - Cisco ASA <- 1 GBPS - Server B, FileZilla Server TCP Window size on large transfers will drop to 0 after around 30 seconds of a large file transfer. RDP session then becomes unresponsive for a minute or two and then is sporadic. After a minute or two, the FTP transfer resumes, but at 1-2 MB/s. When the FTP transfer is over, the responsiveness of the RDP session returns to normal. Test2: Server C in same network as Server B, FileZilla Client <- local network - Server B, FileZilla Server File will transfer at 30+ MB/s. Details ASA: 5520 running 8.3(1) with ASDM 6.3(1) Windows: Server 2003 R2 SP2 with latest patches Server: VMs running on HP C3000 blade chasis FileZilla: 3.3.5.1, latest stable build Transfer: 20 GB SQL .BAK file Protocol: Active FTP over tcp/20, tcp/21 Switches: Cisco Small Business 2048 Gigabit running latest 2.0.0.8 VMware: 4.1 HP: Flex-10 3.15, latest version Notes All servers are VMs. Thoughts Pretty sure the ASA is at fault since a transfer between VMs on the same network will not show a shrinking Window size. Our ASA is pretty vanilla. No major changes made to any of the settings. It has a bunch of NAT and ACLs. Wireshark Sample No. Time Source Destination Protocol Info 234905 73.916986 1.1.1.1 2.2.2.2 TCP ftp-data ivecon-port [ACK] Seq=1 Ack=131981791 Win=65535 Len=0 234906 73.917220 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234907 73.917224 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234908 73.917231 1.1.1.1 2.2.2.2 TCP ftp-data ivecon-port [ACK] Seq=1 Ack=131984551 Win=64155 Len=0 234909 73.917463 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234910 73.917467 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234911 73.917469 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234912 73.917476 1.1.1.1 2.2.2.2 TCP ftp-data ivecon-port [ACK] Seq=1 Ack=131988691 Win=60015 Len=0 234913 73.917706 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234914 73.917710 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234915 73.917715 1.1.1.1 2.2.2.2 TCP ftp-data ivecon-port [ACK] Seq=1 Ack=131991451 Win=57255 Len=0 234916 73.917949 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234917 73.917953 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234918 73.917958 1.1.1.1 2.2.2.2 TCP ftp-data ivecon-port [ACK] Seq=1 Ack=131994211 Win=54495 Len=0 234919 73.918193 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234920 73.918197 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234921 73.918202 1.1.1.1 2.2.2.2 TCP ftp-data ivecon-port [ACK] Seq=1 Ack=131996971 Win=51735 Len=0 234922 73.918435 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234923 73.918440 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234924 73.918445 1.1.1.1 2.2.2.2 TCP ftp-data ivecon-port [ACK] Seq=1 Ack=131999731 Win=48975 Len=0 234925 73.918679 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234926 73.918684 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234927 73.918689 1.1.1.1 2.2.2.2 TCP ftp-data ivecon-port [ACK] Seq=1 Ack=132002491 Win=46215 Len=0 234928 73.918922 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234929 73.918927 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234930 73.918932 1.1.1.1 2.2.2.2 TCP ftp-data ivecon-port [ACK] Seq=1 Ack=132005251 Win=43455 Len=0 234931 73.919165 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234932 73.919169 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234933 73.919174 1.1.1.1 2.2.2.2 TCP ftp-data ivecon-port [ACK] Seq=1 Ack=132008011 
Win=40695 Len=0 234934 73.919408 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234935 73.919413 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234936 73.919418 1.1.1.1 2.2.2.2 TCP ftp-data ivecon-port [ACK] Seq=1 Ack=132010771 Win=37935 Len=0 234937 73.919652 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234938 73.919656 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234939 73.919661 1.1.1.1 2.2.2.2 TCP ftp-data ivecon-port [ACK] Seq=1 Ack=132013531 Win=35175 Len=0 234940 73.919895 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234941 73.919899 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234942 73.919904 1.1.1.1 2.2.2.2 TCP ftp-data ivecon-port [ACK] Seq=1 Ack=132016291 Win=32415 Len=0 234943 73.920138 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234944 73.920142 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234945 73.920147 1.1.1.1 2.2.2.2 TCP ftp-data ivecon-port [ACK] Seq=1 Ack=132019051 Win=29655 Len=0 234946 73.920381 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234947 73.920386 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234948 73.920391 1.1.1.1 2.2.2.2 TCP ftp-data ivecon-port [ACK] Seq=1 Ack=132021811 Win=26895 Len=0 234949 73.920625 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234950 73.920629 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234951 73.920632 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234952 73.920638 1.1.1.1 2.2.2.2 TCP ftp-data ivecon-port [ACK] Seq=1 Ack=132025951 Win=22755 Len=0 234953 73.920868 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234954 73.920871 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234955 73.920876 1.1.1.1 2.2.2.2 TCP ftp-data ivecon-port [ACK] Seq=1 Ack=132028711 Win=19995 Len=0 234956 73.921111 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234957 73.921115 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234958 73.921120 1.1.1.1 2.2.2.2 TCP ftp-data ivecon-port [ACK] Seq=1 Ack=132031471 Win=17235 Len=0 234959 73.921356 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234960 73.921362 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234961 73.921370 1.1.1.1 2.2.2.2 TCP ftp-data ivecon-port [ACK] Seq=1 Ack=132034231 Win=14475 Len=0 234962 73.921598 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234963 73.921606 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234964 73.921613 1.1.1.1 2.2.2.2 TCP ftp-data ivecon-port [ACK] Seq=1 Ack=132036991 Win=11715 Len=0 234965 73.921841 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234966 73.921848 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234967 73.921855 1.1.1.1 2.2.2.2 TCP ftp-data ivecon-port [ACK] Seq=1 Ack=132039751 Win=8955 Len=0 234968 73.922085 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234969 73.922092 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234970 73.922099 1.1.1.1 2.2.2.2 TCP ftp-data ivecon-port [ACK] Seq=1 Ack=132042511 Win=6195 Len=0 234971 73.922328 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234972 73.922335 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234973 73.922342 1.1.1.1 2.2.2.2 TCP ftp-data ivecon-port [ACK] Seq=1 Ack=132045271 Win=3435 Len=0 234974 73.922571 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234975 73.922579 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 1380 bytes 234976 73.922586 1.1.1.1 2.2.2.2 TCP ftp-data ivecon-port [ACK] Seq=1 Ack=132048031 Win=675 Len=0 234981 75.866453 2.2.2.2 1.1.1.1 FTP-DATA FTP Data: 675 bytes 234985 76.020168 1.1.1.1 2.2.2.2 TCP [TCP ZeroWindow] ftp-data ivecon-port [ACK] Seq=1 Ack=132048706 Win=0 Len=0 234989 76.771633 2.2.2.2 1.1.1.1 TCP [TCP ZeroWindowProbe] ivecon-port ftp-data [ACK] Seq=132048706 Ack=1 Win=65535 Len=1 234990 76.771648 
1.1.1.1 2.2.2.2 TCP [TCP ZeroWindowProbeAck] [TCP ZeroWindow] ftp-data ivecon-port [ACK] Seq=1 Ack=132048706 Win=0 Len=0 234997 78.279701 2.2.2.2 1.1.1.1 TCP [TCP ZeroWindowProbe] ivecon-port ftp-data [ACK] Seq=132048706 Ack=1 Win=65535 Len=1 234998 78.279714 1.1.1.1 2.2.2.2 TCP [TCP ZeroWindowProbeAck] [TCP ZeroWindow] ftp-data ivecon-port [ACK] Seq=1 Ack=132048706 Win=0 Len=0

    Read the article

  • ASP.Net Layered app - Share Entity Data Model amongst layers

    - by Chris Klepeis
    How can I share the auto-generated entity data model (generated object classes) amongst all layers of my C# web app whilst only granting query access in the data layer? This uses the typical 3 layer approach: data, business, presentation. My data layer returns an IEnumerable<T> to my business layer, but I cannot return type T to the presentation layer because I do not want the presentation layer to know of the existence of the data layer - which is where the entity framework auto-generated my classes. It was recommended to have a seperate layer with just the data model, but I'm unsure how to seperate the data model from the query functionality the entity framework provides.

    Read the article

  • Create Rails model with argument of associated model?

    - by Kyle Carlson
    I have two models, User and PushupReminder, and a method create_a_reminder in my PushupReminder controller (is that the best place to put it?) that I want to have create a new instance of a PushupReminder for a given user when I pass it a user ID. I have the association via the user_id column working correctly in my PushupReminder table and I've tested that I can both create reminders & send the reminder email correctly via the Rails console. Here is a snippet of the model code: class User < ActiveRecord::Base has_many :pushup_reminders end class PushupReminder < ActiveRecord::Base belongs_to :user end And the create_a_reminder method: def create_a_reminder(user) @user = User.find(user) @reminder = PushupReminder.create(:user_id => @user.id, :completed => false, :num_pushups => @user.pushups_per_reminder, :when_sent => Time.now) PushupReminderMailer.reminder_email(@user).deliver end I'm at a loss for how to run that create_a_reminder method in my code for a given user (eventually will be in a cron job for all my users). If someone could help me get my thinking on the right track, I'd really appreciate it. Thanks!

    Read the article

  • Big Data Appliance X4-2 Release Announcement

    - by Jean-Pierre Dijcks
    Today we are announcing the release of the 3rd generation Big Data Appliance. Read the Press Release here. Software Focus The focus for this 3rd generation of Big Data Appliance is: Comprehensive and Open - Big Data Appliance now includes all Cloudera Software, including Back-up and Disaster Recovery (BDR), Search, Impala, Navigator as well as the previously included components (like CDH, HBase and Cloudera Manager) and Oracle NoSQL Database (CE or EE). Lower TCO then DIY Hadoop Systems Simplified Operations while providing an open platform for the organization Comprehensive security including the new Audit Vault and Database Firewall software, Apache Sentry and Kerberos configured out-of-the-box Hardware Update A good place to start is to quickly review the hardware differences (no price changes!). On a per node basis the following is a comparison between old and new (X3-2) hardware: Big Data Appliance X3-2 Big Data Appliance X4-2 CPU 2 x 8-Core Intel® Xeon® E5-2660 (2.2 GHz) 2 x 8-Core Intel® Xeon® E5-2650 V2 (2.6 GHz) Memory 64GB 64GB Disk 12 x 3TB High Capacity SAS 12 x 4TB High Capacity SAS InfiniBand 40Gb/sec 40Gb/sec Ethernet 10Gb/sec 10Gb/sec For all the details on the environmentals and other useful information, review the data sheet for Big Data Appliance X4-2. The larger disks give BDA X4-2 33% more capacity over the previous generation while adding faster CPUs. Memory for BDA is expandable to 512 GB per node and can be done on a per-node basis, for example for NameNodes or for HBase region servers, or for NoSQL Database nodes. Software Details More details in terms of software and the current versions (note BDA follows a three monthly update cycle for Cloudera and other software): Big Data Appliance 2.2 Software Stack Big Data Appliance 2.3 Software Stack Linux Oracle Linux 5.8 with UEK 1 Oracle Linux 6.4 with UEK 2 JDK JDK 6 JDK 7 Cloudera CDH CDH 4.3 CDH 4.4 Cloudera Manager CM 4.6 CM 4.7 And like we said at the beginning it is important to understand that all other Cloudera components are now included in the price of Oracle Big Data Appliance. They are fully supported by Oracle and available for all BDA customers. For more information: Big Data Appliance Data Sheet Big Data Connectors Data Sheet Oracle NoSQL Database Data Sheet (CE | EE) Oracle Advanced Analytics Data Sheet

    Read the article

  • HTML5 data-* (custom data attribute)

    - by Renso
    Goal: Store custom data with the data attribute on any DOM element and retrieve it. Previously under HTML4 we used to use classes to store custom data, something to the affect of <input class="account void limit-5000 over-4999" /> and then have to parse the data out of the class In a book published by Peter-Paul Koch in 2007, ppk on JavaScript, he explains why and how to use custom attributes to make data more accessible to JavaScript, using name-value pairs. Accessing a custom attribute account-limit=5000 is much easier and more intuitive than trying to parse it out of a class, Plus, what if the class name for example "color-5" has a representative class definition in a CSS stylesheet that hides it away or worse some JavaScript plugin that automatically adds 5000 to it, or something crazy like that, just because it is a valid class name. As you can see there are quite a few reasons why using classes is a bad design and why it was important to define custom data attributes in HTML5. Syntax: You define the data attribute by simply prefixing any data item you want to store with any HTML element with "data-". For example to store our customers account data with a hidden input element: <input type="hidden" data-account="void" data-limit=5000 data-over=4999  /> How to access the data: account  -     element.dataset.account limit    -     element.dataset.limit You can also access it by using the more traditional get/setAttribute method or if using jQuery $('#element').attr('data-account','void') Browser support: All except for IE. There is an IE hack around this at http://gist.github.com/362081. Special Note: Be AWARE, do not use upper-case when defining your data elements as it is all converted to lower-case when reading it, so: data-myAccount="A1234" will not be found when you read it with: element.dataset.myAccount Use only lowercase when reading so this will work: element.dataset.myaccount

    Read the article

  • Review: Data Modeling 101

    I just recently read “Data Modeling 101”by Scott W. Ambler where he gave an overview of fundamental data modeling skills. I think this article was excellent for anyone who was just starting to learn or refresh their skills in regards to the modeling of data.  Scott defines data modeling as the act of exploring data oriented structures.  He goes on to explain about how data models are actually used by defining three different types of models. Types of Data Models Conceptual Data Model  Logical Data Model (LDMs) Physical Data Model(PDMs) He further expands on modeling by exploring common data modeling notations because there are no industry standards for the practice of data modeling. Scott then defines how to actually model data by expanding on entities, attributes, identities, and relationships which are the basic building blocks of data models. In addition he discusses the value of normalization for redundancy and demoralization for performance. Finally, he discuss ways in which Developers and DBAs can become better data modelers through the use of practice, and seeking guidance from more experienced data modelers.

    Read the article

  • Master Data Management – A Foundation for Big Data Analysis

    - by Manouj Tahiliani
    While Master Data Management has crossed the proverbial chasm and is on its way to becoming mainstream, businesses are being hammered by a new megatrend called Big Data. Big Data is characterized by massive volumes, its high frequency, the variety of less structured data sources such as email, sensors, smart meters, social networks, and Weblogs, and the need to analyze vast amounts of data to determine value to improve upon management decisions. Businesses that have embraced MDM to get a single, enriched and unified view of Master data by resolving semantic discrepancies and augmenting the explicit master data information from within the enterprise with implicit data from outside the enterprise like social profiles will have a leg up in embracing Big Data solutions. This is especially true for large and medium-sized businesses in industries like Retail, Communications, Financial Services, etc that would find it very challenging to get comprehensive analytical coverage and derive long-term success without resolving the limitations of the heterogeneous topology that leads to disparate, fragmented and incomplete master data. For analytical success from Big Data or in other words ROI from Big Data Investments, businesses need to acquire, organize and analyze the deluge of data to make better decisions. There will need to be a coexistence of structured and unstructured data and to maintain a tight link between the two to extract maximum insights. MDM is the catalyst that helps maintain that tight linkage by providing an understanding about the identity, characteristics of Persons, Companies, Products, Suppliers, etc. associated with the Big Data and thereby help accelerate ROI. In my next post I will discuss about patterns for co-existing Big Data Solutions and MDM. Feel free to provide comments and thoughts on above as well as Integration or Architectural patterns.

    Read the article

  • VS 2010 SP1 and SQL CE

    - by ScottGu
    Last month we released the Beta of VS 2010 Service Pack 1 (SP1).  You can learn more about the VS 2010 SP1 Beta from Jason Zander’s two blog posts about it, and from Scott Hanselman’s blog post that covers some of the new capabilities enabled with it.   You can download and install the VS 2010 SP1 Beta here. Last week I blogged about the new Visual Studio support for IIS Express that we are adding with VS 2010 SP1. In today’s post I’m going to talk about the new VS 2010 SP1 tooling support for SQL CE, and walkthrough some of the cool scenarios it enables.  SQL CE – What is it and why should you care? SQL CE is a free, embedded, database engine that enables easy database storage. No Database Installation Required SQL CE does not require you to run a setup or install a database server in order to use it.  You can simply copy the SQL CE binaries into the \bin directory of your ASP.NET application, and then your web application can use it as a database engine.  No setup or extra security permissions are required for it to run. You do not need to have an administrator account on the machine. Just copy your web application onto any server and it will work. This is true even of medium-trust applications running in a web hosting environment. SQL CE runs in-memory within your ASP.NET application and will start-up when you first access a SQL CE database, and will automatically shutdown when your application is unloaded.  SQL CE databases are stored as files that live within the \App_Data folder of your ASP.NET Applications. Works with Existing Data APIs SQL CE 4 works with existing .NET-based data APIs, and supports a SQL Server compatible query syntax.  This means you can use existing data APIs like ADO.NET, as well as use higher-level ORMs like Entity Framework and NHibernate with SQL CE.  This enables you to use the same data programming skills and data APIs you know today. Supports Development, Testing and Production Scenarios SQL CE can be used for development scenarios, testing scenarios, and light production usage scenarios.  With the SQL CE 4 release we’ve done the engineering work to ensure that SQL CE won’t crash or deadlock when used in a multi-threaded server scenario (like ASP.NET).  This is a big change from previous releases of SQL CE – which were designed for client-only scenarios and which explicitly blocked running in web-server environments.  Starting with SQL CE 4 you can use it in a web-server as well. There are no license restrictions with SQL CE.  It is also totally free. Easy Migration to SQL Server SQL CE is an embedded database – which makes it ideal for development, testing, and light-usage scenarios.  For high-volume sites and applications you’ll probably want to migrate your database to use SQL Server Express (which is free), SQL Server or SQL Azure.  These servers enable much better scalability, more development features (including features like Stored Procedures – which aren’t supported with SQL CE), as well as more advanced data management capabilities. We’ll ship migration tools that enable you to optionally take SQL CE databases and easily upgrade them to use SQL Server Express, SQL Server, or SQL Azure.  You will not need to change your code when upgrading a SQL CE database to SQL Server or SQL Azure.  Our goal is to enable you to be able to simply change the database connection string in your web.config file and have your application just work. 
New Tooling Support for SQL CE in VS 2010 SP1 VS 2010 SP1 includes much improved tooling support for SQL CE, and adds support for using SQL CE within ASP.NET projects for the first time.  With VS 2010 SP1 you can now: Create new SQL CE Databases Edit and Modify SQL CE Database Schema and Indexes Populate SQL CE Databases within Data Use the Entity Framework (EF) designer to create model layers against SQL CE databases Use EF Code First to define model layers in code, then create a SQL CE database from them, and optionally edit the DB with VS Deploy SQL CE databases to remote servers using Web Deploy and optionally convert them to full SQL Server databases You can take advantage of all of the above features from within both ASP.NET Web Forms and ASP.NET MVC based projects. Download You can enable SQL CE tooling support within VS 2010 by first installing VS 2010 SP1 (beta). Once SP1 is installed, you’ll also then need to install the SQL CE Tools for Visual Studio download.  This is a separate download that enables the SQL CE tooling support for VS 2010 SP1. Walkthrough of Two Scenarios In this blog post I’m going to walkthrough how you can take advantage of SQL CE and VS 2010 SP1 using both an ASP.NET Web Forms and an ASP.NET MVC based application. Specifically, we’ll walkthrough: How to create a SQL CE database using VS 2010 SP1, then use the EF4 visual designers in Visual Studio to construct a model layer from it, and then display and edit the data using an ASP.NET GridView control. How to use an EF Code First approach to define a model layer using POCO classes and then have EF Code-First “auto-create” a SQL CE database for us based on our model classes.  We’ll then look at how we can use the new VS 2010 SP1 support for SQL CE to inspect the database that was created, populate it with data, and later make schema changes to it.  We’ll do all this within the context of an ASP.NET MVC based application. You can follow the two walkthroughs below on your own machine by installing VS 2010 SP1 (beta) and then installing the SQL CE Tools for Visual Studio download (which is a separate download that enables SQL CE tooling support for VS 2010 SP1). Walkthrough 1: Create a SQL CE Database, Create EF Model Classes, Edit the Data with a GridView This first walkthrough will demonstrate how to create and define a SQL CE database within an ASP.NET Web Form application.  We’ll then build an EF model layer for it and use that model layer to enable data editing scenarios with an <asp:GridView> control. Step 1: Create a new ASP.NET Web Forms Project We’ll begin by using the File->New Project menu command within Visual Studio to create a new ASP.NET Web Forms project.  We’ll use the “ASP.NET Web Application” project template option so that it has a default UI skin implemented: Step 2: Create a SQL CE Database Right click on the “App_Data” folder within the created project and choose the “Add->New Item” menu command: This will bring up the “Add Item” dialog box.  Select the “SQL Server Compact 4.0 Local Database” item (new in VS 2010 SP1) and name the database file to create “Store.sdf”: Note that SQL CE database files have a .sdf filename extension. Place them within the /App_Data folder of your ASP.NET application to enable easy deployment. When we clicked the “Add” button above a Store.sdf file was added to our project: Step 3: Adding a “Products” Table Double-clicking the “Store.sdf” database file will open it up within the Server Explorer tab.  
Since it is a new database, there are no tables within it. Right-click on the "Tables" icon and choose the "Create Table" menu command to create a new database table. We'll name the new table "Products" and add 4 columns to it. We'll mark the first column as a primary key (and make it an identity column so that its value will automatically increment with each new row). When we click "OK" our new Products table will be created in the SQL CE database.

Step 4: Populate with Data

Once our Products table is created it will show up within the Server Explorer. We can right-click it and choose the "Show Table Data" menu command to edit its data. Let's add a few sample rows of data to it.

Step 5: Create an EF Model Layer

We have a SQL CE database with some data in it – let's now create an EF model layer that will provide a way for us to easily query and update data within it.

Let's right-click on our project and choose the "Add->New Item" menu command. This will bring up the "Add New Item" dialog – select the "ADO.NET Entity Data Model" item within it and name it "Store.edmx". This will add a new Store.edmx item to our Solution Explorer and launch a wizard that allows us to quickly create an EF model.

Select the "Generate From Database" option and click next. Choose to use the Store.sdf SQL CE database we just created and then click next again. The wizard will then ask you what database objects you want to import into your model. Let's choose to import the "Products" table we created earlier. When we click the "Finish" button, Visual Studio will open up the EF designer. It will have a Product entity already on it that maps to the "Products" table within our SQL CE database. The VS 2010 SP1 EF designer works exactly the same with SQL CE as it does already with SQL Server and SQL Express. The Product entity above will be persisted as a class (called "Product") that we can programmatically work against within our ASP.NET application.

Step 6: Compile the Project

Before using your model layer you'll need to build your project. Do a Ctrl+Shift+B to compile the project, or use the Build->Build Solution menu command.

Step 7: Create a Page that Uses our EF Model Layer

Let's now create a simple ASP.NET Web Form that contains a GridView control that we can use to display and edit our Products data (via the EF model layer we just created).

Right-click on the project and choose the Add->New Item command. Select the "Web Form from Master Page" item template, and name the page you create "Products.aspx". Base the master page on the "Site.Master" template that is in the root of the project.

Add an <h2>Products</h2> heading to the new page, and add an <asp:GridView> control within it. Then click the "Design" tab to switch into design view. Select the GridView control, and then click the top-right corner to display the GridView's "Smart Tasks" UI. Choose the "New data source…" drop-down option. This will bring up a dialog which allows you to pick your data source type.

Select the "Entity" data source option – which will allow us to easily connect our GridView to the EF model layer we created earlier. This will bring up another dialog that allows us to pick our model layer. Select the "StoreEntities" option in the dropdown – which is the EF model layer we created earlier.
Then click next – which will allow us to pick which entity within it we want to bind to. Select the "Products" entity in the dialog – which indicates that we want to bind against the "Product" entity class we defined earlier. Then click the "Enable automatic updates" checkbox to ensure that we can both query and update Products. When you click "Finish", VS will wire up an <asp:EntityDataSource> to your <asp:GridView> control.

The last two steps we'll do will be to click the "Enable Editing" checkbox on the grid (which will cause the grid to display an "Edit" link on each row) and (optionally) use the Auto Format dialog to pick a UI template for the grid.

Step 8: Run the Application

Let's now run our application and browse to the /Products.aspx page that contains our GridView. When we do so we'll see a grid UI of the products within our SQL CE database. Clicking the "Edit" link for any of the rows will allow us to edit their values. When we click "Update" the GridView will post back the values, persist them through our EF model layer, and ultimately save them within our SQL CE database.

Learn More about using EF with ASP.NET Web Forms

Read this tutorial series on the http://asp.net site to learn more about how to use EF with ASP.NET Web Forms. The tutorial series uses SQL Express as the database – but the nice thing is that all of the same steps/concepts can also now be done with SQL CE.

Walkthrough 2: Using EF Code First with SQL CE and ASP.NET MVC 3

We used a database-first approach with the sample above – where we first created the database, and then used the EF designer to create model classes from the database. In addition to supporting a designer-based development workflow, EF also enables a more code-centric option which we call "code first development". Code First development enables a pretty sweet development workflow. It enables you to:

- Define your model objects by simply writing "plain old classes" with no base classes or visual designer required
- Use a "convention over configuration" approach that enables database persistence without explicitly configuring anything
- Optionally override the convention-based persistence and use a fluent code API to fully customize the persistence mapping
- Optionally auto-create a database based on the model classes you define – allowing you to start from code first

I've done several blog posts about EF Code First in the past – I really think it is great. The good news is that it also works very well with SQL CE. The combination of SQL CE, EF Code First, and the new VS tooling support for SQL CE enables a pretty nice workflow. Below is a simple example of how you can use them to build a simple ASP.NET MVC 3 application.

Step 1: Create a new ASP.NET MVC 3 Project

We'll begin by using the File->New Project menu command within Visual Studio to create a new ASP.NET MVC 3 project. We'll use the "Internet Project" template so that it has a default UI skin implemented.

Step 2: Use NuGet to Install EFCodeFirst

Next we'll use the NuGet package manager (automatically installed by ASP.NET MVC 3) to add the EFCodeFirst library to our project. We'll use the Package Manager command shell to do this. Bring up the package manager console within Visual Studio by selecting the View->Other Windows->Package Manager Console menu command.
Then type:

    install-package EFCodeFirst

within the package manager console to download the EFCodeFirst library and have it added to our project. When we enter the above command, the EFCodeFirst library will be downloaded and added to our application.

Step 3: Build Some Model Classes

Using a "code first" based development workflow, we will create our model classes first (even before we have a database). We create these model classes by writing code. For this sample, we will right-click on the "Models" folder of our project and add three classes: Dinner, RSVP and NerdDinners (minimal sketches of the code for Steps 3-5 appear after Step 6 below).

The "Dinner" and "RSVP" model classes are "plain old CLR objects" (aka POCO). They do not need to derive from any base classes or implement any interfaces, and the properties they expose are standard .NET data types. No data persistence attributes or data code has been added to them. The "NerdDinners" class derives from the DbContext class (which is supplied by EFCodeFirst) and handles the retrieval/persistence of our Dinner and RSVP instances from a database.

Step 4: Listing Dinners

We've written all of the code necessary to implement our model layer for this simple project. Let's now expose and implement the URL /Dinners/Upcoming within our project. We'll use it to list upcoming dinners that happen in the future.

We'll do this by right-clicking on our "Controllers" folder and selecting the "Add->Controller" menu command. We'll name the controller we want to create "DinnersController". We'll then implement an "Upcoming" action method within it that lists upcoming dinners using our model layer above. We will use a LINQ query to retrieve the data and pass it to a view to render. We'll then right-click within our Upcoming method and choose the "Add->View" menu command to create an "Upcoming" view template that displays our dinners. We'll use the "empty" template option within the "Add View" dialog and write the view template using Razor (again, see the sketches after Step 6 below).

Step 5: Configure our Project to use a SQL CE Database

We have finished writing all of our code – our last step will be to configure a database connection string to use. We will point our NerdDinners model class to a SQL CE database by adding a <connectionStrings> entry to the web.config file at the top of our project (also sketched below). EF Code First uses a default convention where context classes will look for a connection string that matches the DbContext class name. Because we created a "NerdDinners" class earlier, we've also named our connection string "NerdDinners". We are configuring our connection string to use SQL CE as the database, and telling it that our SQL CE database file will live within the \App_Data directory of our ASP.NET project.

Step 6: Running our Application

Now that we've built our application, let's run it! We'll browse to the /Dinners/Upcoming URL – doing so will display an empty list of upcoming dinners. You might ask – but where did it query to get the dinners from? We didn't explicitly create a database?!?

One of the cool features that EF Code First supports is the ability to automatically create a database (based on the schema of our model classes) when the database we point it at doesn't exist. Above we configured EF Code First to point at a SQL CE database in the \App_Data\ directory of our project. When we ran our application, EF Code First saw that the SQL CE database didn't exist and automatically created it for us.
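For reference, the three model classes from Step 3 were shown as screenshots in the original post. Below is a minimal sketch of what they could look like; only the class names (Dinner, RSVP, NerdDinners) come from the walkthrough, the property names and bodies are assumptions:

    using System;
    using System.Collections.Generic;
    using System.Data.Entity; // DbContext is supplied by the EFCodeFirst package

    // Plain old CLR objects: no base classes or persistence attributes required.
    public class Dinner
    {
        public int DinnerID { get; set; }       // "ClassName + ID" is picked up
        public string Title { get; set; }       // as the primary key by convention
        public DateTime EventDate { get; set; }
        public string Address { get; set; }
        public virtual ICollection<RSVP> RSVPs { get; set; }
    }

    public class RSVP
    {
        public int RSVPID { get; set; }
        public int DinnerID { get; set; }
        public string AttendeeEmail { get; set; }
        public virtual Dinner Dinner { get; set; }
    }

    // The context class handles retrieval/persistence of Dinner and RSVP instances.
    public class NerdDinners : DbContext
    {
        public DbSet<Dinner> Dinners { get; set; }
        public DbSet<RSVP> RSVPs { get; set; }
    }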
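The Upcoming action and its Razor view template from Step 4 were likewise shown as images; here is a hedged reconstruction (the exact query shape and markup are assumptions):

    using System;
    using System.Linq;
    using System.Web.Mvc;

    public class DinnersController : Controller
    {
        NerdDinners db = new NerdDinners();

        // GET: /Dinners/Upcoming - lists dinners that happen in the future
        public ActionResult Upcoming()
        {
            var dinners = from d in db.Dinners
                          where d.EventDate > DateTime.Now
                          orderby d.EventDate
                          select d;

            return View(dinners.ToList());
        }
    }

And an Upcoming.cshtml view along these lines:

    @model IEnumerable<Dinner>

    <h2>Upcoming Dinners</h2>
    <ul>
        @foreach (var dinner in Model) {
            <li>@dinner.Title on @dinner.EventDate.ToShortDateString()</li>
        }
    </ul>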
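Finally, the connection string entry from Step 5 would look something like the following in web.config (the name must match the DbContext class per the convention described above; the provider name is the SQL CE 4.0 ADO.NET provider):

    <configuration>
      <connectionStrings>
        <add name="NerdDinners"
             connectionString="Data Source=|DataDirectory|\NerdDinners.sdf"
             providerName="System.Data.SqlServerCe.4.0" />
      </connectionStrings>
    </configuration>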
Step 7: Using VS 2010 SP1 to Explore our newly created SQL CE Database

Click the "Show All Files" icon within the Solution Explorer and you'll see the "NerdDinners.sdf" SQL CE database file that was automatically created for us by EF Code First within the \App_Data\ folder. We can optionally right-click on the file and choose "Include In Project" to add it to our solution. We can also double-click the file (regardless of whether it is added to the project) and VS 2010 SP1 will open it as a database we can edit within the "Server Explorer" tab of the IDE.

When we double-click our NerdDinners.sdf SQL CE file, we can drill in to see the schema of the Dinners and RSVPs tables in the tree explorer. Notice how two tables – Dinners and RSVPs – were automatically created for us within our SQL CE database. This was done by EF Code First when we accessed the NerdDinners class by running our application above.

We can right-click on a table and use the "Show Table Data" command to enter some upcoming dinners in our database, using the built-in editor that VS 2010 SP1 supports to populate our table data. And now when we hit "refresh" on the /Dinners/Upcoming URL within our browser, we'll see some upcoming dinners show up.

Step 8: Changing our Model and Database Schema

Let's now modify the schema of our model layer and database, and walk through one way that the new VS 2010 SP1 tooling support for SQL CE can make this easier. With EF Code First you typically start making database changes by modifying the model classes. For example, let's add an additional string property called "UrlLink" to our "Dinner" class. We'll use this to point to a link for more information about the event.

Now when we re-run our project and visit the /Dinners/Upcoming URL, we'll see an error thrown. We are seeing this error because EF Code First automatically created our database, and by default when it does this it adds a table that helps track whether the schema of our database is in sync with our model classes. EF Code First helpfully throws an error when they become out of sync – making it easier to track down issues at development time that you might otherwise only find (via obscure errors) at runtime. Note that if you do not want this feature you can turn it off by changing the default conventions of your DbContext class (in this case our NerdDinners class) to not track the schema version.

Our model classes and database schema are out of sync in the above example – so how do we fix this? There are two approaches you can use today:

- Delete the database and have EF Code First automatically re-create the database based on the new model class schema (losing the data within the existing DB)
- Modify the schema of the existing database to make it in sync with the model classes (keeping/migrating the data within the existing DB)

There are a couple of ways you can do the second approach. Below I'm going to show how you can take advantage of the new VS 2010 SP1 tooling support for SQL CE to use a database schema tool to modify our database structure. We are also going to be supporting a "migrations" feature with EF in the future that will allow you to automate/script database schema migrations programmatically.

Step 9: Modify our SQL CE Database Schema using VS 2010 SP1

The new SQL CE tooling support within VS 2010 SP1 makes it easy to modify the schema of our existing SQL CE database.
To do this we'll right-click on our "Dinners" table and choose the "Edit Table Schema" command. This will bring up the "Edit Table" dialog. We can rename, change or delete any of the existing columns in our table, or click at the bottom of the column listing and type to add a new column. Here I've added a new "UrlLink" column of type "nvarchar" (since our property is a string). When we click OK our database will be updated to have the new column, and our schema will now match our model classes.

Because we are manually modifying our database schema, there is one additional step we need to take to let EF Code First know that the database schema is in sync with our model classes. As I mentioned earlier, when a database is automatically created by EF Code First it adds an "EdmMetadata" table to the database to track schema versions (hashing our model classes against it to detect mismatches between our model classes and the database schema). Since we are manually updating and maintaining our database schema, we don't need this table – we can just delete it. This will leave us with just the two tables that correspond to our model classes. And now when we re-run our /Dinners/Upcoming URL it will display the dinners correctly.

One last touch would be to update our view to check for the new UrlLink property and render an <a> link to it if an event has one. And now when we refresh /Dinners/Upcoming we will see hyperlinks for the events that have a UrlLink stored in the database.

Summary

SQL CE provides a free, embedded database engine that you can use to easily enable database storage. With SQL CE 4 you can now take advantage of it within ASP.NET projects and applications (both Web Forms and MVC).

VS 2010 SP1 provides tooling support that enables you to easily create, edit and modify SQL CE databases – as well as use the standard EF designer against them. This allows you to re-use your existing skills and data knowledge while taking advantage of an embedded database option. This is useful both for small applications (where you don't need the scalability of a full SQL Server), and for development and testing scenarios – where you want to be able to rapidly develop/test your application without having a full database instance. SQL CE makes it easy to later migrate your data to a full SQL Server or SQL Azure instance if you want to – without having to change any code in your application. All we would need to change in the above two scenarios is the <connectionString> value within the web.config file in order to have our code run against a full SQL Server. This gives you the flexibility to start with a small embedded database and scale up as needed.

Hope this helps,

Scott

P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu

    Read the article

  • SQL SERVER – Why Do We Need Data Quality Services – Importance and Significance of Data Quality Services (DQS)

    - by pinaldave
    Databases are awesome. I'm sure my readers know my opinion about this – I have made SQL Server my life's work after all! I love technology and all things computer-related. Of course, even with my love for technology, I have to admit that it has its limits. For example, it takes a human brain to notice that data has been input incorrectly. Computer "brains" might be faster than humans, but human brains are still better at pattern recognition. For example, a human brain will notice that "300" is a ridiculous age for a human to be, but to a computer it is just a number. A human will also notice similarities between "P. Dave" and "Pinal Dave," but this would stump most computers.

In a database, these sorts of anomalies are incredibly important. Databases are often used by multiple people who rely on the data to be true and accurate, so data quality is key. That is why the improved SQL Server, alongside its Master Data Management features, includes Data Quality Services. This service has the ability to recognize and flag anomalies like out-of-range numbers and similarities between data. This allows a human brain, with its pattern recognition abilities, to double-check and ensure that P. Dave is the same as Pinal Dave.

A nice feature of Data Quality Services is that once you set the rules for the program to follow, it will not only keep your data organized in the future, but also go back and "fix up" any data that has already been entered. It also allows you to combine data from multiple places and will apply these rules across the board, so that you don't have any weird issues that crop up when trying to fit a round peg into a square hole.

There are two parts of Data Quality Services that help you accomplish all these neat things. The first part is DQS Server, which you can think of as the engine of the system. It is installed alongside SQL Server (it needs to be installed separately, after SQL Server is installed) and runs quietly in the background, performing all its cleanup services.

DQS Client is the user interface that you interact with to set the rules and check over your data. There are three main aspects of the Client: knowledge base management, data quality projects and administration. Knowledge base management is the part of the system that allows you to set the rules, or program the "knowledge base," so that your database is clean and consistent. Data quality projects are what run in the background and clean up the data that is already present. The administration area allows you to check out what the DQS Client is doing, change rules, and generally oversee the entire process. The whole process is user-friendly and a pleasure to use. I highly recommend implementing Data Quality Services in your database.

Here are a few of my blog posts which are related to Data Quality Services, and I encourage you to try this out.
- SQL SERVER – Installing Data Quality Services (DQS) on SQL Server 2012
- SQL SERVER – Step by Step Guide to Beginning Data Quality Services in SQL Server 2012 – Introduction to DQS
- SQL SERVER – DQS Error – Cannot connect to server – A .NET Framework error occurred during execution of user-defined routine or aggregate "SetDataQualitySessions" – SetDataQualitySessionPhaseTwo
- SQL SERVER – Configuring Interactive Cleansing Suggestion Min Score for Suggestions in Data Quality Services (DQS) – Sensitivity of Suggestion
- SQL SERVER – Unable to DELETE Project in Data Quality Projects (DQS)

Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology Tagged: Data Quality Services, DQS

    Read the article

  • Welcome Oracle Data Integration 12c: Simplified, Future-Ready Solutions with Extreme Performance

    - by Irem Radzik
    The big day for the Oracle Data Integration team has finally arrived! It is my honor to introduce you to Oracle Data Integration 12c. Today we announced the general availability of the 12c release for Oracle's key data integration products: Oracle Data Integrator 12c and Oracle GoldenGate 12c. The new release delivers extreme performance, increases IT productivity, and simplifies deployment, while helping IT organizations to keep pace with new data-oriented technology trends including cloud computing, big data analytics, and real-time business intelligence. With the 12c release Oracle becomes the new leader in data integration and replication technologies, as no other vendor offers such a complete set of data integration capabilities for pervasive, continuous access to trusted data across Oracle platforms as well as third-party systems and applications.

The Oracle Data Integration 12c release addresses data-driven organizations' critical and evolving data integration requirements under 3 key themes:

- Future-Ready Solutions
- Extreme Performance
- Fast Time-to-Value

There are many new features that support these key differentiators for Oracle Data Integrator 12c and for Oracle GoldenGate 12c. In this first 12c blog post, I will highlight only a few:

Future-Ready Solutions to Support Current and Emerging Initiatives: Oracle Data Integration offers robust and reliable solutions for key technology trends including cloud computing, big data analytics, real-time business intelligence and continuous data availability. Via tight integration with Oracle's database, middleware, and application offerings, Oracle Data Integration will continue to support new features and capabilities right away as these products evolve and provide advanced features.

Extreme Performance: Both GoldenGate and Data Integrator are known for their high performance. The new release widens the gap even further against the competition. Oracle GoldenGate 12c's Integrated Delivery feature enables higher throughput via a special application programming interface into Oracle Database. As mentioned in the press release, customers already report up to 5X higher performance compared to earlier versions of GoldenGate. Oracle Data Integrator 12c introduces parallelism that significantly increases its performance as well.

Fast Time-to-Value via Higher IT Productivity and Simplified Solutions: Oracle Data Integrator 12c's new flow-based declarative UI brings superior developer productivity, ease of use, and ultimately fast time to market for end users. The ability to seamlessly reuse mapping logic also speeds development. Oracle GoldenGate 12c's Integrated Delivery feature automatically tunes the process, saving time while improving performance.

This is just a quick glimpse into Oracle Data Integrator 12c and Oracle GoldenGate 12c. On November 12th we will reveal much more about the new release in our video webcast "Introducing 12c for Oracle Data Integration". Our customer and partner speakers, including SolarWorld, BT, and Rittman Mead, will join us in launching the new release. Please join us at this free event to learn more from our executives about the 12c release, hear our customers' perspectives on the new features, and ask your questions to our experts in the live Q&A. Also, please continue to follow our blogs, tweets, and Facebook updates as we unveil more about the new features of the latest release.

    Read the article

  • CoreData Model Objects for API

    - by theiOSguy
    I am using CoreData in my application. I want to abstract out all the CoreData related stuff as an API, so that the consumer can use the API instead of directly using CoreData and its generated model objects. CoreData generates the managed object model as follows:

@interface Person : NSManagedObject
@end

I want to define my API, for example MyAPI, and it has a function called createPerson:(Person*)p. So the consumer of this createPerson API needs to create a Person data object (like a POJO in the Java world) and invoke this API. But I cannot create a Person object using Person *p = [[Person alloc] init]; because the designated initializer for this Person model created by CoreData does not allow this type of creation. So should I define a corresponding user-facing data object, maybe PersonDO, and have this API take that instead, to carry the data into the API implementation? Is my approach right? Can any experts advise whether designing the API this way is a good design pattern?

    Read the article

  • Big Data – Final Wrap and What Next – Day 21 of 21

    - by Pinal Dave
    In yesterday's blog post we explored various resources related to learning Big Data, and in this blog post we will wrap up this 21-day series on Big Data. I have been exploring various terms and technology related to Big Data this entire month. It was indeed fun to write about Big Data in 21 days, but the subject of Big Data is much bigger than anyone can cover in 21 days. My first goal was to write about the basics, and I think we have got that covered pretty well. During these 21 days I have received many questions related to Big Data. I have covered a few of the questions in this series, and a few more I will be covering in the coming months.

Now that we understand the Big Data basics, here is what I am personally going to do next. I thought I would share it with you, as it will give you a good idea of how to continue the Big Data journey:

- Build a schedule to read the various Apache documentation
- Watch all Pluralsight courses
- Explore the Hortonworks Sandbox
- Start building presentations about Big Data – this is a great way to learn something new
- Present at user group meetings on Big Data topics
- Write more blog posts about Big Data

I am going to continue learning about Big Data – and I want you to continue learning Big Data too. Please leave a comment describing how you are going to continue learning about Big Data; I will publish all the informative comments on this blog with due credit. I want to end this series with the infographic by UMUC.

Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Big Data, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL

    Read the article

  • SQL Developer Data Modeler v3.3 Early Adopter: Link Model Objects Across Designs

    - by thatjeffsmith
    The third post in our "What's New in SQL Developer Data Modeler v3.3" series: SQL Developer Data Modeler now allows you to link objects across models. If you need to catch up on the earlier posts, here are the first two:

- New and Improved Search
- Collaborative Design via Excel

Today's post is a very simple and straightforward discussion of how to share objects across models and designs. In previous releases you could easily copy and paste objects between models and designs. Simply select your object, right-click and select "Copy". Once copied, paste it into your other designs and then make changes as required. Once you paste the object, it is no longer associated with the source it was copied from. You are free to make any changes you want in the new location without affecting the source material. And it works the other way as well – make any changes to the source material and the new object is also unaffected.

However. What if you want to LINK a model object instead of COPYING it? In version 3.3 you can now do this. Simply drag and drop the object instead of copying and pasting it. Select the object, in this case a relational model table, and drag it to your other model. It's as simple as it sounds: here's a little animated GIF showing drag and drop between models/designs to LINK an object.

Notes:

- The "linked" object cannot be modified from the destination space
- Updating the source object will propagate the changes forward to wherever it's been linked
- You can drag a linked object to another design, so dragging from A to B and then from B to C will work
- Linked objects are annotated in the model with a "Chain" bitmap indicating the object has been linked from another design/model and cannot be modified

A very simple feature, but I like the flexibility here. Copy and paste = new independent object. Drag and drop = linked object.

    Read the article

  • Partner Webcast - Focus on Oracle Data Profiling and Data Quality 11g

    - by lukasz.romaszewski(at)oracle.com
    Partner Webcast: Focus on Oracle Data Profiling and Data Quality 11g
February 24th, 12am CET

Oracle offers an integrated suite of Data Quality software architected to discover and correct today's data quality problems and establish a platform prepared for tomorrow's yet unknown data challenges.

Oracle Data Profiling provides data investigation, discovery, and profiling in support of quality, migration, integration, stewardship, and governance initiatives. It includes a broad range of features that expand upon basic profiling, including automated monitoring, business-rule validation, and trend analysis.

Oracle Data Quality for Data Integrator provides cleansing, standardization, matching, address validation, location enrichment, and linking functions for global customer data and operational business data. It ensures that data adheres to established standards that are adaptable to fit each organization's specific needs. Both single- and double-byte data are processed in local languages to provide a unique and centralized view of customers, products and services.

During this briefing, Data Integration Solution Specialists will be providing a technical overview and a walkthrough.

Agenda:
- Oracle Data Integration Strategy overview
- A focus on Oracle Data Profiling and Oracle Data Quality for Data Integrator:
  - Oracle Data Profiling
  - Oracle Data Quality for Data Integrator
  - Live demo
  - Q&A

Delivery Format: This FREE online LIVE eSeminar will be delivered over the Web and Conference Call. Registrations received less than 24 hours prior to start time may not receive confirmation to attend. To register, click here.

For any questions please contact [email protected]

    Read the article

< Previous Page | 2 3 4 5 6 7 8 9 10 11 12 13  | Next Page >