Search Results

Search found 109760 results on 4391 pages for 'ado net entity data model'.


  • Why are data structures so important in interviews?

    - by Vamsi Emani
    I am a newcomer to the corporate world who recently graduated in computer science. I am a Java/Groovy developer and a quick learner: I can pick up new frameworks, APIs, or even programming languages in a fairly short amount of time. Despite that, I must confess that I was not very strong in data structures when I graduated from college. Throughout the campus placements during my studies, I noticed that most of the big tech companies like Amazon and Microsoft focused mainly on data structures. It appears as if data structures are the only thing they expect from a graduate. On top of this, there is a general perception that a good programmer is necessarily one with good knowledge of data structures. To be honest, I felt bad about that. I write good code and follow standard design patterns, and I do use data structures, but only at a superficial level, through the APIs Java exposes like ArrayList and LinkedList. The companies, however, usually focused on the intricate aspects of data structures, like pointer-based memory manipulation and time complexity. Probably because of my Java background, I understood code efficiency and logic only in terms of object-oriented programming concepts like objects and instances, and I never drilled down to the level of bits and bytes. I did not want people to look down on me for this gap in my knowledge of data structures. So why all this emphasis on data structures? Does not having knowledge of data structures really affect one's career in programming? Or is knowledge of this subject really a sufficient basis to differentiate a good programmer from a bad one?

    Read the article

  • Data structure for pattern matching.

    - by alvonellos
    Let's say you have an input file with many entries like these: date, ticker, open, high, low, close, <and some other values>, and you want to execute a pattern-matching routine on the entries (rows) in that file, using a candlestick pattern, for example (see: Doji). That pattern can appear on any uniform time interval (let t = 1s, 5s, 10s, 1d, 7d, 2w, 2y, and so on). Say a pattern-matching routine can take an arbitrary number of rows to perform an analysis and contain an arbitrary number of subpatterns; in other words, some patterns may require 4 entries to operate on. Say also that the routine may later have to find and classify extrema (local and global maxima and minima as well as inflection points) for the ticker over a closed interval; for example, a cubic function (x^3) has extrema on the interval [-1, 1] (see link). What would be the most natural choice of data structure? What about an interface that maps a Ticker object containing one row of data into a collection of Ticker objects, so that an arbitrary pattern can be applied to the data? What's the first thing that comes to mind? I chose a doubly-linked circular list with the following methods: push_front(), push_back(), pop_front(), pop_back(), and an overloaded [] that can be used with negative indices. But that data structure seems very clumsy: with so much pushing and popping going on, I have to make a deep copy of the data structure before running an analysis on it. I don't know if I made my question very clear, but the main points are: What kind of data structures should be considered when analyzing sequential data points against a pattern that does NOT require random access? What kind of data structures should be considered when classifying extrema of a set of data points?
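    A fixed-size ring buffer (circular buffer) is one structure worth considering for the sequential case, since a pattern only ever needs the most recent N rows and nothing has to be popped or deep-copied. The sketch below is illustrative only; the Ticker fields, the window size, and the toy Doji test are assumptions, not part of the original question.

        using System;

        public class Ticker
        {
            public DateTime Date;
            public string Symbol;
            public decimal Open, High, Low, Close;
        }

        // Fixed-capacity window over the most recent rows; old rows are overwritten in place.
        public class TickerWindow
        {
            private readonly Ticker[] _buffer;
            private int _next;                     // index where the next row will be written
            public int Count { get; private set; }

            public TickerWindow(int capacity) { _buffer = new Ticker[capacity]; }

            public void Add(Ticker row)
            {
                _buffer[_next] = row;
                _next = (_next + 1) % _buffer.Length;
                if (Count < _buffer.Length) Count++;
            }

            // this[0] is the oldest row in the window, this[Count - 1] the newest.
            public Ticker this[int i]
            {
                get
                {
                    if (i < 0) i += Count;          // negative indices count from the newest row
                    int start = (_next - Count + _buffer.Length) % _buffer.Length;
                    return _buffer[(start + i) % _buffer.Length];
                }
            }
        }

        // A pattern is just a predicate over the current window, e.g. a naive Doji test on the newest row.
        public static class Patterns
        {
            public static bool IsDoji(TickerWindow w)
            {
                Ticker t = w[-1];
                return Math.Abs(t.Open - t.Close) <= 0.001m * t.Open;
            }
        }

    Extrema classification can then run over the same window (or over a plain array of closes) without any copying, because the buffer is only ever read during analysis.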

    Read the article

  • How to manage security of these self-hosted Web APIs, to ensure that requests for data are authenticated?

    - by Husrat Mehmood
    Let's pretend I am going to work on an enterprise application. Say I have 11 modules in the application and I have to develop dashboards for every role in the organization for which the application is being built. We decided to use ASP.NET Web API and return JSON data from our APIs. We are going to include 11 self-hosted Web API projects in our application (one self-hosted Web API per module). All 11 modules are connected to one SQL Server 2012 database. Once the APIs are ready, we will have to create business dashboards (based on the roles in the organization). So my Web API client is an ASP.NET MVC application; ASP.NET MVC will consume those Web APIs. Here is the part all this explanation leads up to: How should I manage security for all 11 self-hosted Web APIs? How do I ensure that only authenticated requests come through? Say I authenticate a user by login and password, redirect the user to the dashboard designed for that user's role, and load data by consuming the Web APIs. How should I ensure that the requests made to fetch that data are authenticated?
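    One common approach for self-hosted Web API is to issue a token at login and have every API project validate it in a message handler before the request reaches a controller. The sketch below is only illustrative; the header name, token format, and IsValidToken helper are assumptions, not anything taken from the question.

        using System.Linq;
        using System.Net;
        using System.Net.Http;
        using System.Threading;
        using System.Threading.Tasks;

        // Rejects any request that does not carry a valid token; register it once per self-hosted project.
        public class TokenAuthenticationHandler : DelegatingHandler
        {
            protected override Task<HttpResponseMessage> SendAsync(
                HttpRequestMessage request, CancellationToken cancellationToken)
            {
                string token = request.Headers.Contains("X-Auth-Token")          // hypothetical header name
                    ? request.Headers.GetValues("X-Auth-Token").FirstOrDefault()
                    : null;

                if (string.IsNullOrEmpty(token) || !IsValidToken(token))
                {
                    var reply = request.CreateResponse(HttpStatusCode.Unauthorized);
                    return Task.FromResult(reply);
                }

                return base.SendAsync(request, cancellationToken);
            }

            private static bool IsValidToken(string token)
            {
                // Placeholder: verify a signed/encrypted token or look it up in the shared database.
                return token == "demo-token";
            }
        }

        // Registration, e.g. when each self-host is configured:
        //   var config = new HttpSelfHostConfiguration("http://localhost:8080");
        //   config.MessageHandlers.Add(new TokenAuthenticationHandler());

    Because each of the 11 projects talks to the same database, they can all share the same token validation logic, and the MVC client simply attaches the token it received at login to every outgoing request.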

    Read the article

  • .NET Regex: How to extract IPv6 address parts

    - by Quandary
    Question: How does the .NET regex string to extract IPv6 addresses look like ? I can get it to extract a simple IPv6 address like "1050:0:0:0:5:600:300c:326b" but not the colon format ("ff06::c3"); My problem is, it should extract a 0 for every omitted value between the :: How do I do that? Below my code + description. Specify IPv6 addresses by omitting leading zeros. For example, IPv6 address 1050:0000:0000:0000:0005:0600:300c:326b may be written as 1050:0:0:0:5:600:300c:326b. Double colon Specify IPv6 addresses by using double colons (::) in place of a series of zeros. For example, IPv6 address ff06:0:0:0:0:0:0:c3 may be written as ff06::c3. Double colons may be used only once in an IP address. strInputString = "ff06::c3"; strInputString = "1050:0000:0000:0000:0005:0600:300c:326b"; string strPattern = "([A-Fa-f0-9]{1,4}:){7}([A-Fa-f0-9]{1,4})"; //strPattern = @"\A(?:[0-9a-fA-F]{1,4}:){7}[0-9a-fA-F]{1,4}\z"; //strPattern = @"(\A([0-9a-f]{1,4}:){1,1}(:[0-9a-f]{1,4}){1,6}\Z)|(\A([0-9a-f]{1,4}:){1,2}(:[0-9a-f]{1,4}){1,5}\Z)|(\A([0-9a-f]{1,4}:){1,3}(:[0-9a-f]{1,4}){1,4}\Z)|(\A([0-9a-f]{1,4}:){1,4}(:[0-9a-f]{1,4}){1,3}\Z)|(\A([0-9a-f]{1,4}:){1,5}(:[0-9a-f]{1,4}){1,2}\Z)|(\A([0-9a-f]{1,4}:){1,6}(:[0-9a-f]{1,4}){1,1}\Z)|(\A(([0-9a-f]{1,4}:){1,7}|:):\Z)|(\A:(:[0-9a-f]{1,4}){1,7}\Z)|(\A((([0-9a-f]{1,4}:){6})(25[0-5]|2[0-4]\d|[0-1]?\d?\d)(\.(25[0-5]|2[0-4]\d|[0-1]?\d?\d)){3})\Z)|(\A(([0-9a-f]{1,4}:){5}[0-9a-f]{1,4}:(25[0-5]|2[0-4]\d|[0-1]?\d?\d)(\.(25[0-5]|2[0-4]\d|[0-1]?\d?\d)){3})\Z)|(\A([0-9a-f]{1,4}:){5}:[0-9a-f]{1,4}:(25[0-5]|2[0-4]\d|[0-1]?\d?\d)(\.(25[0-5]|2[0-4]\d|[0-1]?\d?\d)){3}\Z)|(\A([0-9a-f]{1,4}:){1,1}(:[0-9a-f]{1,4}){1,4}:(25[0-5]|2[0-4]\d|[0-1]?\d?\d)(\.(25[0-5]|2[0-4]\d|[0-1]?\d?\d)){3}\Z)|(\A([0-9a-f]{1,4}:){1,2}(:[0-9a-f]{1,4}){1,3}:(25[0-5]|2[0-4]\d|[0-1]?\d?\d)(\.(25[0-5]|2[0-4]\d|[0-1]?\d?\d)){3}\Z)|(\A([0-9a-f]{1,4}:){1,3}(:[0-9a-f]{1,4}){1,2}:(25[0-5]|2[0-4]\d|[0-1]?\d?\d)(\.(25[0-5]|2[0-4]\d|[0-1]?\d?\d)){3}\Z)|(\A([0-9a-f]{1,4}:){1,4}(:[0-9a-f]{1,4}){1,1}:(25[0-5]|2[0-4]\d|[0-1]?\d?\d)(\.(25[0-5]|2[0-4]\d|[0-1]?\d?\d)){3}\Z)|(\A(([0-9a-f]{1,4}:){1,5}|:):(25[0-5]|2[0-4]\d|[0-1]?\d?\d)(\.(25[0-5]|2[0-4]\d|[0-1]?\d?\d)){3}\Z)|(\A:(:[0-9a-f]{1,4}){1,5}:(25[0-5]|2[0-4]\d|[0-1]?\d?\d)(\.(25[0-5]|2[0-4]\d|[0-1]?\d?\d)){3}\Z) "; //strPattern = @"/^\s*((([0-9A-Fa-f]{1,4}:){7}([0-9A-Fa-f]{1,4}|:))|(([0-9A-Fa-f]{1,4}:){6}(:[0-9A-Fa-f]{1,4}|((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3})|:))|(([0-9A-Fa-f]{1,4}:){5}(((:[0-9A-Fa-f]{1,4}){1,2})|:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3})|:))|(([0-9A-Fa-f]{1,4}:){4}(((:[0-9A-Fa-f]{1,4}){1,3})|((:[0-9A-Fa-f]{1,4})?:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:))|(([0-9A-Fa-f]{1,4}:){3}(((:[0-9A-Fa-f]{1,4}){1,4})|((:[0-9A-Fa-f]{1,4}){0,2}:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:))|(([0-9A-Fa-f]{1,4}:){2}(((:[0-9A-Fa-f]{1,4}){1,5})|((:[0-9A-Fa-f]{1,4}){0,3}:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:))|(([0-9A-Fa-f]{1,4}:){1}(((:[0-9A-Fa-f]{1,4}){1,6})|((:[0-9A-Fa-f]{1,4}){0,4}:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:))|(:(((:[0-9A-Fa-f]{1,4}){1,7})|((:[0-9A-Fa-f]{1,4}){0,5}:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:)))(%.+)?\s*$/"; //strPattern = @"(:?[0-9a-fA-F]{1,4}:){7}([0-9a-fA-F]{1,4})\z"; //strPattern = 
@"\A((?:[0-9A-Fa-f]{1,4}(?::[0-9A-Fa-f]{1,4})*)?)::((?:[0-9A-Fa-f]{1,4}(?::[0-9A-Fa-f]{1,4})*)?)\z"; //strPattern = @"\A((?:[0-9A-Fa-f]{1,4}(?::[0-9A-Fa-f]{1,4})*)?)::((?:[0-9A-Fa-f]{1,4}:)*)(25[0-5]|2[0-4]\d|[0-1]?\d?\d)(\.(25[0-5]|2[0-4]\d|[0-1]?\d?\d)){3}\z"; //strPattern = @"/^(?:(?:(?:(?:[a-f0-9]{1,4}(?::[a-f0-9]{1,4}){7})|(?:(?!(?:.*[a-f0-9](?::|$)){7,})(?:[a-f0-9]{1,4}(?::[a-f0-9]{1,4}){0,5})?::(?:[a-f0-9]{1,4}(?::[a-f0-9]{1,4}){0,5})?)))|(?:(?:(?:[a-f0-9]{1,4}(?::[a-f0-9]{1,4}){5}:)|(?:(?!(?:.*[a-f0-9]:){5,})(?:[a-f0-9]{1,4}(?::[a-f0-9]{1,4}){0,3})?::(?:[a-f0-9]{1,4}(?::[a-f0-9]{1,4}){0,3}:)?))?(?:(?:25[0-5])|(?:2[0-4][0-9])|(?:1[0-9]{2})|(?:[1-9]?[0-9]))(?:\.(?:(?:25[0-5])|(?:2[0-4][0-9])|(?:1[0-9]{2})|(?:[1-9]?[0-9]))){3}))$/i"; System.Text.RegularExpressions.Regex reValidationRule = new System.Text.RegularExpressions.Regex("^" + strPattern + "$"); if (reValidationRule.Match(strInputString).Success) // If matching pattern { System.Text.RegularExpressions.Match maResult = System.Text.RegularExpressions.Regex.Match(strInputString, strPattern); // Console.WriteLine(maResult.Groups.Count) string[] astrReturnValues = new string[4]; System.Text.RegularExpressions.GroupCollection gc = maResult.Groups; System.Text.RegularExpressions.CaptureCollection cc; int counter; //System.Web.Script.Serialization.JavaScriptSerializer jssJSONserializer = new System.Web.Script.Serialization.JavaScriptSerializer(); //Console.WriteLine(jssJSONserializer.Serialize()); // Loop through each group. for (int i = 0; i < gc.Count; i++) { Console.WriteLine("Group: {0}", i); cc = gc[i].Captures; counter = cc.Count; // Print number of captures in this group. Console.WriteLine("Captures count = " + counter.ToString()); // Loop through each capture in group. for (int ii = 0; ii < counter; ii++) { Console.WriteLine("Capture: {0}", ii); // Print capture and position. Console.WriteLine(cc[ii] + " Starts at character " + cc[ii].Index); } }

    Read the article

  • F# in ASP.NET, mathematics and testing

    - by DigiMortal
    Starting with Visual Studio 2010, F# is a full member of the .NET Framework language family. It is a functional language with syntax specific to functional languages, and I think it is time for us to notice and study functional languages too. In this posting I will show you some examples of cool things other people have done using F#. F# and ASP.NET: As I am an ASP/ASP.NET MVP, I am – of course – interested in how people use different languages and technologies with ASP.NET. C# MVP Tomáš Petrícek writes about developing ASP.NET MVC applications using F#. He also shows how to use LINQ to SQL in F# (using the F# PowerPack) and provides a sample solution and a Visual Studio 2010 template for F# MVC web applications. You may also find it interesting how you can create controllers in F#. Excellent work, Tomáš! Vladimir Matveev has an interesting example of how to use F# and the ApplicationHost class to process ASP.NET requests outside of IIS. This is a simple and very straightforward example and I strongly suggest you take a look at it. Another very cool example is the Storm project on CodePlex. Storm is a web services testing tool that is fully written in F#. Take a look at the site, because CodePlex offers the source code as well as the binaries. Math: Functional languages are strong in fields like mathematics and physics. When I wrote my C# example about the BigInteger class, I found out that the recursive version of the Fibonacci algorithm in C# does not perform well. I ran the same experiment in F#, and in F# there were no performance problems with the recursive version. You can find the F# version of the Fibonacci algorithm in Bob Palmer's blog posting Fibonacci numbers in F#. Although the golden spiral is useful for solving many problems, I looked for a practical code example and found one: Kean Walmsley published a very interesting posting on his Through the Interface blog, Creating Fibonacci spirals in AutoCAD using F#. There are also other cool examples you may be interested in. Using the numerical components by Extreme Optimization it is possible to do numerical integration (quadrature) in F# (a C# example is also available). fsharp.it introduces factorial calculation in F#. Robert Pickering has done very good work programming The Game of Life in Silverlight and F# – I definitely suggest you try out this example, as it is very illustrative too. Those who want something more complex may take a look at the Newton basin fractal example in F# by Jonathan Birge. Testing: After some searching and surfing I found out that almost everything is available for F# to write tests and test your F# code. FsCheck – a port of Haskell's QuickCheck; important parts of the manual for using FsCheck are adapted almost literally from the QuickCheck manual and paper (any errors and omissions are entirely my responsibility). FsTest – this project is designed to provide language-oriented programming constructs for unit testing and behavior testing in F#; the goal is to create a domain-specific language for testing F# code in a way that makes sense for functional programming. FsUnit – FsUnit makes unit testing with F# more enjoyable; it adds a special syntax to your favorite .NET testing framework. xUnit.net – a developer testing framework built to support test-driven development, with a design goal of extreme simplicity and alignment with framework features.
    It is compatible with .NET Framework 2.0 and later, and offers several runners: console, GUI, MSBuild, and Visual Studio integration via TestDriven.net, the CodeRush Test Runner, and ReSharper. It also offers test project integration for ASP.NET MVC. Getting started: First of all, you need Visual Studio 2010. Then take a look at these resources: F# samples @ MSDN, Microsoft F# Developer Center @ MSDN, F# Language Reference @ MSDN, the F# blog, the F# forums, and Real World Functional Programming: With Examples in F# and C# (Amazon). Happy F#-ing! :)
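    For readers who hit the same wall with the naive recursive Fibonacci in C#, memoization removes the exponential blow-up while keeping BigInteger for large results. This is a generic sketch, not code from any of the posts linked above:

        using System;
        using System.Collections.Generic;
        using System.Numerics;

        static class Fib
        {
            private static readonly Dictionary<int, BigInteger> Cache = new Dictionary<int, BigInteger>();

            // Recursive definition, but each value is computed only once.
            public static BigInteger Of(int n)
            {
                if (n < 2) return n;
                BigInteger result;
                if (Cache.TryGetValue(n, out result)) return result;
                result = Of(n - 1) + Of(n - 2);
                Cache[n] = result;
                return result;
            }

            static void Main()
            {
                Console.WriteLine(Fib.Of(200));   // finishes instantly; the naive recursive version would not
            }
        }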

    Read the article

  • Is ASP.NET MVC completely (and exclusively) based on conventions?

    - by Mike Valeriano
    --TL;DR Is there a "Hello World!" ASP.NET MVC tutorial out there that doesn't rely on conventions and "stock" projects? Is it even possible to take advantage of the technology without reusing the default file structure, and start from a single "hello_world.asp" file or something (like in PHP)? Am I completely mistaken and should I be looking somewhere else, maybe this? I'm interested in the MVC framework, not Web Forms. --Background I've played a bit with PHP in the past, just for fun, and now I'm back to it since web development became relevant for me once again. I'm no professional, but I try to gain as much knowledge of and control over the technology I'm working with as possible. I'm using Visual Studio 2012 for C# - my "desktop" language of choice - and since I got the Professional Edition from Dreamspark, the Web Development Tools are available, including ASP.NET MVC 4. I won't touch Web Forms, but the MVC framework got my attention because the MVC pattern is something I can really relate to, since it provides the control I want but... not quite. Learning PHP was easy - and right from the start I could just create a "hello_world.php" file and do something like this for immediate results: <!-- file: hello_world.php --> <?php echo "Hello World!"; ?> But I couldn't find a single ASP.NET (MVC) tutorial out there (I'll be sure to buy one of the upcoming MVC 4 books, only a month away or so) that starts like that. They all start with a sample project, building up knowledge from the basics and heavily using conventions as they go along. Which is fine, I suppose, but it's not the best way for me to learn things. Even the "Empty" project template for a new ASP.NET MVC 4 application in VS2012 is not empty at all: several files and folders are created for you - much like a new C# desktop application project, but with C# I can in fact start from scratch, creating the project structure myself. It is not the case with PHP: I can choose from a plethora of different MVC frameworks, I can create my own framework, or I can skip frameworks altogether and toss random PHP along with my HTML into a single file and make it work. I understand the framework needs to establish some rules, but what if I just want to create a single-page website with some C# logic behind it? Do I really need to create a whole bloat of files and folders for the sake of convention? Also, please understand that I haven't gotten far in any of those tutorials, mainly for this reason, but if that's the only way to do it, I'll go for it using one of the books I've mentioned before. This is my first contact with ASP.NET, but from the few comparisons I've read, I believe I should stay the hell away from Web Forms. Thank you. (Please forgive the broken English - it is not my primary language.)
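    For what it's worth, the framework's moving parts can be reduced to very little: one route registration and one controller class. The sketch below is a hedged illustration of roughly the minimum an ASP.NET MVC application needs, not an official "no conventions" template; the file names and class names are arbitrary.

        using System.Web.Mvc;
        using System.Web.Routing;

        // Global.asax.cs - the only wiring the framework really requires.
        public class MvcApplication : System.Web.HttpApplication
        {
            protected void Application_Start()
            {
                RouteTable.Routes.MapRoute(
                    name: "Default",
                    url: "{controller}/{action}",
                    defaults: new { controller = "Hello", action = "Index" });
            }
        }

        // HelloController.cs - no view, no model, just a response.
        public class HelloController : Controller
        {
            public ActionResult Index()
            {
                return Content("Hello World!");   // plain text instead of a .cshtml view
            }
        }

    The Views/Models/Content folders the templates create are conveniences for the default view engine, not hard requirements for getting a response out of the pipeline.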

    Read the article

  • Five Key Strategies in Master Data Management

    - by david.butler(at)oracle.com
    Here is a very interesting Profit Magazine article on MDM: A recent customer survey reveals the deleterious effects of data fragmentation. by Trevor Naidoo, December 2010   Across industries and geographies, IT organizations have grown in complexity, whether due to mergers and acquisitions, or decentralized systems supporting functional or departmental requirements. With systems architected over time to support unique, one-off process needs, they are becoming costly to maintain, and the Internet has only further added to the complexity. Data fragmentation has become a key inhibitor in delivering flexible, user-friendly systems. The Oracle Insight team conducted a survey assessing customers' master data management (MDM) capabilities over the past two years to get a sense of where they are in terms of their capabilities. The responses, by 27 respondents from six different industries, reveal five key areas in which customers need to improve their data management in order to get better financial results. 1. Less than 15 percent of organizations surveyed understand the sources and quality of their master data, and have a roadmap to address missing data domains. Examples of the types of master data domains referred to are customer, supplier, product, financial and site. Many organizations have multiple sources of master data with varying degrees of data quality in each source -- customer data stored in the customer relationship management system is inconsistent with customer data stored in the order management system. Imagine not knowing how many places you stored your customer information, and whether a customer's address was the most up to date in each source. In fact, more than 55 percent of the respondents in the survey manage their data quality on an ad-hoc basis. It is important for organizations to document their inventory of data sources and then profile these data sources to ensure that there is a consistent definition of key data entities throughout the organization. Some questions to ask are: How do we define a customer? What is a product? How do we define a site? The goal is to strive for one common repository for master data that acts as a cross reference for all other sources and ensures consistent, high-quality master data throughout the organization. 2. Only 18 percent of respondents have an enterprise data management strategy to ensure that data is treated as an asset to the organization. Most respondents handle data at the department or functional level and do not have an enterprise view of their master data. The sales department may track all their interactions with customers as they move through the sales cycle, the service department is tracking their interactions with the same customers independently, and the finance department also has a different perspective on the same customer. The salesperson may not be aware that the customer she is trying to sell to is experiencing issues with existing products purchased, or that the customer is behind on previous invoices. The lack of a data strategy makes it difficult for business users to turn data into information via reports. Without the key building blocks in place, it is difficult to create key linkages between customer, product, site, supplier and financial data. These linkages make it possible to understand patterns. A well-defined data management strategy is aligned to the business strategy and helps create the governance needed to ensure that data stewardship is in place and data integrity is intact. 3. 
Almost 60 percent of respondents have no strategy to integrate data across operational applications. Many respondents have several disparate sources of data with no strategy to keep them in sync with each other. Even though there is no clear strategy to integrate the data (see #2 above), the data needs to be synced and cross-referenced to keep the business processes running. About 55 percent of respondents said they perform this integration on an ad hoc basis, and in many cases, it is done manually with the help of Microsoft Excel spreadsheets. For example, a salesperson needs a report on global sales for a specific product, but the product has different product numbers in different countries. Typically, an analyst will pull all the data into Excel, manually create a cross reference for that product, and then aggregate the sales. The exact same procedure has to be followed if the same report is needed the following month. A well-defined consolidation strategy will ensure that a central cross-reference is maintained with updates in any one application being propagated to all the other systems, so that data is synchronized and up to date. This can be done in real time or in batch mode using integration technology. 4. Approximately 50 percent of respondents spend manual efforts cleansing and normalizing data. Information stored in various systems usually follows different standards and formats, making it difficult to match the data. A customer's address can be stored in different ways using a variety of abbreviations -- for example, "av" or "ave" for avenue. Similarly, a product's attributes can be stored in a number of different ways; for example, a size attribute can be stored in inches and can also be entered as "'' ". These types of variations make it difficult to match up data from different sources. Today, most customers rely on manual, heroic efforts to match, cleanse, and de-duplicate data -- clearly not a scalable, sustainable model. To solve this challenge, organizations need the ability to standardize data for customers, products, sites, suppliers and financial accounts; however, less than 10 percent of respondents have technology in place to automatically resolve duplicates. It is no wonder, therefore, that we get communications about products we don't own, at addresses we don't reside, and using channels (like direct mail) we don't like. An all-too-common example of a potential challenge follows: Customers end up receiving duplicate communications, which not only impacts customer satisfaction, but also incurs additional mailing costs. Cleansing, normalizing, and standardizing data will help address most of these issues. 5. Only 10 percent of respondents have the ability to share data that was mastered in a master data hub. Close to 60 percent of respondents have efforts in place that profile, standardize and cleanse data manually, and the output of these efforts are stored in spreadsheets in various parts of the organization. This valuable information is not easily shared with the rest of the organization and, more importantly, this enriched information cannot be sent back to the source systems so that the data is fixed at the source. A key benefit of a master data management strategy is not only to clean the data, but to also share the data back to the source systems as well as other systems that need the information. Aside from the source systems, another key beneficiary of this data is the business intelligence system. 
Having clean master data as input to business intelligence systems provides more accurate and enhanced reporting.  Characteristics of Stellar MDM When deciding on the right master data management technology, organizations should look for solutions that have four main characteristics: enterprise-grade MDM performance complete technology that can be rapidly deployed and addresses multiple business issues end-to-end MDM process management with data quality monitoring and assurance pre-built MDM business relevant applications with data stores and workflows These master data management capabilities will aid in moving closer to a best-practice maturity level, delivering tremendous efficiencies and savings as well as revenue growth opportunities as a result of better understanding your customers.  Trevor Naidoo is a senior director in Industry Strategy and Insight at Oracle. 

    Read the article

  • JPA entity design / cannot delete entity

    - by timaschew
    I thought what I want is simple, but I cannot find any solution to my problem. I'm using Play Framework 1.2.3, which uses Hibernate as its JPA provider, so I think Play itself has nothing to do with the problem. I have some classes (I omit the irrelevant fields):

        public class User { ... }
        public class Task { public DataContainer dataContainer; }
        public class DataContainer { public Session session; public User user; }
        public class Session { ... }

    So I have an association from Task to DataContainer and from DataContainer to Session, and the DataContainer belongs to a User. The DataContainers can all have the same User, but the Session has to be different for each instance, and the DataContainer of a Task also has to be different for each instance. A DataContainer can have a Session or not (it's optional). I use only unidirectional associations; that should be sufficient. In other words: every Task must have one DataContainer; every DataContainer must have one (possibly shared) User and can have one Session. To create a DB schema I use JPA annotations:

        @Entity public class User extends Model { ... }
        @Entity public class Task extends Model {
            @OneToOne(optional = false, cascade = CascadeType.ALL)
            public DataContainer dataContainer;
        }
        @Entity public class DataContainer extends Model {
            @OneToOne(optional = true, cascade = CascadeType.ALL)
            public Session session;
            @ManyToOne(optional = false, cascade = CascadeType.ALL)
            public User user;
        }
        @Entity public class Session extends Model { ... }

    BTW: Model is a Play class and provides the primary id as a long. When I create an object for each entity and 'connect them' (I mean the associations), it works fine. But when I try to delete a Session, I get a constraint violation exception, because a DataContainer still refers to the Session I want to delete. I want the Session field of the DataContainer to be set to null, i.e. the foreign key (session_id) should be unset in the database. That would be okay, because it's optional. I think I have multiple problems. Am I using the right annotation, @OneToOne? I found some additional annotations and attributes on the internet: @JoinColumn, and a mappedBy attribute for the inverse relationship. But I don't have those, because the association is not bidirectional. Or is a bidirectional association essential? Another try was to use @OnDelete(action = OnDeleteAction.CASCADE); then the constraint changed from NO ACTION on update or delete to:

        ADD CONSTRAINT fk4745c17e6a46a56 FOREIGN KEY (session_id) REFERENCES annotation_session (id) MATCH SIMPLE ON UPDATE NO ACTION ON DELETE CASCADE;

    But in this case, when I delete a Session, the DataContainer and User are deleted too. That's wrong for me. EDIT: I'm using PostgreSQL 9, the JDBC driver is included in Play, and my only db config is db=postgres://app:app@localhost:5432/app

    Read the article

  • Address Regulatory Mandates for Data Encryption Without Changing Your Applications

    - by Troy Kitch
    The Payment Card Industry Data Security Standard, US state-level data breach laws, and numerous data privacy regulations worldwide all call for data encryption to protect personally identifiable information (PII). However encrypting PII data in applications requires costly and complex application changes. Fortunately, since this data typically resides in the application database, using Oracle Advanced Security, PII can be encrypted transparently by the Oracle database without any application changes. In this ISACA webinar, learn how Oracle Advanced Security offers complete encryption for data at rest, in transit, and on backups, along with built-in key management to help organizations meet regulatory requirements and save money. You will also hear from TransUnion Interactive, the consumer subsidiary of TransUnion, a global leader in credit and information management, which maintains credit histories on an estimated 500 million consumers across the globe, about how they addressed PCI DSS encryption requirements using Oracle Database 11g with Oracle Advanced Security. Register to watch the webinar now.

    Read the article

  • Guest (and occasional co-host) on Jesse Liberty's Yet Another Podcast

    - by Jon Galloway
    I was a recent guest on Jesse Liberty's Yet Another Podcast talking about the latest Visual Studio, ASP.NET and Azure releases. Download / Listen: Yet Another Podcast #75–Jon Galloway on ASP.NET/ MVC/ Azure Co-hosted shows: Jesse's been inviting me to co-host shows and I told him I'd show up when I was available. It's a nice change to be a drive-by co-host on a show (compared with the work that goes into organizing / editing / typing show notes for Herding Code shows). My main focus is on Herding Code, but it's nice to pop in and talk to Jesse's excellent guests when it works out. Some shows I've co-hosted over the past year: Yet Another Podcast #76–Glenn Block on Node.js & Technology in China Yet Another Podcast  #73 - Adam Kinney on developing for Windows 8 with HTML5 Yet Another Podcast #64 - John Papa & Javascript Yet Another Podcast #60 - Steve Sanderson and John Papa on Knockout.js Yet Another Podcast #54–Damian Edwards on ASP.NET Yet Another Podcast #53–Scott Hanselman on Blogging Yet Another Podcast #52–Peter Torr on Windows Phone Multitasking Yet Another Podcast #51–Shawn Wildermuth: //build, Xaml Programming & Beyond And some more on the way that haven't been released yet. Some of these I'm pretty quiet, on others I get wacky and hassle the guests because, hey, not my podcast so not my problem. Show notes from the ASP.NET / MVC / Azure show: What was just released Visual Studio 2012 Web Developer features ASP.NET 4.5 Web Forms Strongly Typed data controls Data access via command methods Similar Binding syntax to ASP.NET MVC Some context: Damian Edwards and WebFormsMVP Two questions from Jesse: Q: Are you making this harder or more complicated for Web Forms developers? Short answer: Nothing's removed, it's just a new option History of SqlDataSource, ObjectDataSource Q: If I'm using some MVC patterns, why not just move to MVC? Short answer: This works really well in hybrid applications, doesn't require a rewrite Allows sharing models, validation, other code between Web Forms and MVC ASP.NET MVC Adaptive Rendering (oh, also, this is in Web Forms 4.5 as well) Display Modes Mobile project template using jQuery Mobile OAuth login to allow Twitter, Google, Facebook, etc. login Jon (and friends') MVC 4 book on the way: Professional ASP.NET MVC 4 Windows 8 development Jesse and Jon announce they're working on a new book: Pro Windows 8 Development with XAML and C# Jon and Jesse agree that it's nice to be able to write Windows 8 applications using the same skills they picked up for Silverlight, WPF, and Windows Phone development. Compare / contrast ASP.NET MVC and Windows 8 development Q: Does ASP.NET and HTML5 development overlap? Jon thinks they overlap in the MVC world because you're writing HTML views without controls Jon describes how his web development career moved from a preoccupation with server code to a focus on user interaction, which occurs in the browser Jon mentions his NDC Oslo presentation on Learning To Love HTML as Beautiful Code Q: How do you apply C# / XAML or HTML5 skills to Windows 8 development? Q: If I'm a XAML programmer, what's the learning curve on getting up to speed on ASP.NET MVC? Jon describes the difference in application lifecycle and state management Jon says it's nice that web development is really interactive compared to application development Q: Can you learn MVC by reading a book? Or is it a lot bigger than that? What is Azure, and why would I use it? 
Jon describes the traditional Azure platform mode and how Azure Web Sites fits in Q: Why wouldn't Jesse host his blog on Azure Web Sites? Domain names on Azure Web Sites File hosting options Q: Is Azure just another host? How is it different from any of the other shared hosting options? A: Azure gives you the ability to scale up or down whenever you want A: Other services are available if or when you want them

    Read the article

  • Sesame Data Browser: filtering, sorting, selecting and linking

    - by Fabrice Marguerie
    I have deferred the post about how Sesame is built in favor of publishing a new update. This new release offers major features such as the ability to quickly filter and sort data, select columns, and create hyperlinks to OData. Filtering, sorting, selecting In order to filter data, you just have to use the filter row, which becomes available when you click on the funnel button: You can then type some text and select an operator: The data grid will be refreshed immediately after you apply a filter. It works in the same way for sorting. Clicking on a column will immediately update the query and refresh the grid. Note that multi-column sorting is possible by using SHIFT-click: Viewing data is not enough. You can also view and copy the query string that returns that data: One more thing you can do to shape data is to select which columns are displayed. Simply use the Column Chooser and you'll be done: Again, this will update the data and query string in real time: Linking to Sesame, linking to OData The other main feature of this release is the ability to create hyperlinks to Sesame. That's right, you can ask Sesame to give you a link you can display on a webpage, send in an email, or type in a chat session. You can get a link to a connection: or to a query: You'll note that you can also decide to embed Sesame in a webpage... Here are some sample links created via Sesame: Netflix movies with high ratings, sorted by release year Netflix horror movies from the 21st century Northwind discontinued products with remaining stock Netflix empty connection I'll give more examples in a post to follow. There are many more minor improvements in this release, but I'll let you find out about them by yourself :-) Please try Sesame Data Browser now and let me know what you think! PS: if you use Sesame from the desktop, please use the "Remove this application" command in the context menu of the desktop app and then "Install on desktop" again in your web browser. I'll activate automatic updates with the next release.
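    Behind the scenes, the filtering, sorting, and column selection described above map onto OData system query options in the generated query string. The URLs below are illustrative guesses at what such links look like (the service root and property names are made up), not the exact links Sesame produces:

        http://odata.example.com/Catalog/Titles?$filter=AverageRating ge 4&$orderby=ReleaseYear desc
        http://odata.example.com/Catalog/Titles?$filter=ReleaseYear ge 2000 and Genre eq 'Horror'
        http://odata.example.com/Northwind/Products?$filter=Discontinued eq true and UnitsInStock gt 0&$select=ProductName,UnitsInStock

    In other words, a filter row entry becomes a $filter clause, a column click becomes $orderby, and the Column Chooser becomes $select, which is why the query string updates in real time as you work.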

    Read the article

  • Using Hadoop (HDInsight) with Microsoft - Two (OK, Three) Options

    - by BuckWoody
    Microsoft has many tools for “Big Data”. In fact, you need many tools – there’s no product called “Big Data Solution” in a shrink-wrapped box – if you find one, you probably shouldn’t buy it. It’s tempting to want a single tool that handles everything in a problem domain, but with large, complex data, that isn’t a reality. You’ll mix and match several systems, open and closed source, to solve a given problem. But there are tools that help with handling data at large, complex scales. Normally the best way to do this is to break up the data into parts, and then put the calculation engines for that chunk of data right on the node where the data is stored. These systems are in a family called “Distributed File and Compute”. Microsoft has a couple of these, including the High Performance Computing edition of Windows Server. Recently we partnered with Hortonworks to bring the Apache Foundation’s release of Hadoop to Windows. And as it turns out, there are actually two (technically three) ways you can use it. (There’s a more detailed set of information here: http://www.microsoft.com/sqlserver/en/us/solutions-technologies/business-intelligence/big-data.aspx, I’ll cover the options at a general level below)  First Option: Windows Azure HDInsight Service  Your first option is that you can simply log on to a Hadoop control node and begin to run Pig or Hive statements against data that you have stored in Windows Azure. There’s nothing to set up (although you can configure things where needed), and you can send the commands, get the output of the job(s), and stop using the service when you are done – and repeat the process later if you wish. (There are also connectors to run jobs from Microsoft Excel, but that’s another post)   This option is useful when you have a periodic burst of work for a Hadoop workload, or the data collection has been happening into Windows Azure storage anyway. That might be from a web application, the logs from a web application, telemetrics (remote sensor input), and other modes of constant collection.   You can read more about this option here:  http://blogs.msdn.com/b/windowsazure/archive/2012/10/24/getting-started-with-windows-azure-hdinsight-service.aspx Second Option: Microsoft HDInsight Server Your second option is to use the Hadoop Distribution for on-premises Windows called Microsoft HDInsight Server. You set up the Name Node(s), Job Tracker(s), and Data Node(s), among other components, and you have control over the entire ecostructure.   This option is useful if you want to  have complete control over the system, leave it running all the time, or you have a huge quantity of data that you have to bulk-load constantly – something that isn’t going to be practical with a network transfer or disk-mailing scheme. You can read more about this option here: http://www.microsoft.com/sqlserver/en/us/solutions-technologies/business-intelligence/big-data.aspx Third Option (unsupported): Installation on Windows Azure Virtual Machines  Although unsupported, you could simply use a Windows Azure Virtual Machine (we support both Windows and Linux servers) and install Hadoop yourself – it’s open-source, so there’s nothing preventing you from doing that.   Aside from being unsupported, there are other issues you’ll run into with this approach – primarily involving performance and the amount of configuration you’ll need to do to access the data nodes properly. 
But for a single-node installation (where all components run on one system) such as learning, demos, training and the like, this isn’t a bad option. Did I mention that’s unsupported? :) You can learn more about Windows Azure Virtual Machines here: http://www.windowsazure.com/en-us/home/scenarios/virtual-machines/ And more about Hadoop and the installation/configuration (on Linux) here: http://en.wikipedia.org/wiki/Apache_Hadoop And more about the HDInsight installation here: http://www.microsoft.com/web/gallery/install.aspx?appid=HDINSIGHT-PREVIEW Choosing the right option Since you have two or three routes you can go, the best thing to do is evaluate the need you have, and place the workload where it makes the most sense.  My suggestion is to install the HDInsight Server locally on a test system, and play around with it. Read up on the best ways to use Hadoop for a given workload, understand the parts, write a little Pig and Hive, and get your feet wet. Then sign up for a test account on HDInsight Service, and see how that leverages what you know. If you're a true tinkerer, go ahead and try the VM route as well. Oh - there’s another great reference on the Windows Azure HDInsight that just came out, here: http://blogs.msdn.com/b/brunoterkaly/archive/2012/11/16/hadoop-on-azure-introduction.aspx  

    Read the article

  • Oracle - A Leader in Gartner's MQ for Master Data Management for Customer Data

    - by Mala Narasimharajan
    The Gartner MQ report for Master Data Management of Customer Data Solutions has been released, and we're proud to say that Oracle is in the leaders' quadrant. Here's a snippet from the report itself: "Oracle has a strong, though complex, portfolio of domain-specific MDM products that include prepackaged data models. Gartner estimates that Oracle now has over 1,500 licensed MDM customers, including 650 customers managing customer data. The MDM portfolio includes three products that address MDM of customer data solution needs: Oracle Fusion Customer Hub (FCH), Oracle CDH and Oracle Siebel UCM. These three MDM products are positioned for different segments of the market and Oracle is progressively moving all three products onto a common MDM technology platform..." (Gartner, Oct 18, 2012). For more information on Oracle's solutions for customer data in Master Data Management, click here.

    Read the article

  • Almost Realtime Data and Web application

    - by Chris G.
    I have a computer that is recording 100 different data points into an OPC server. I've written a simple OPC client that can read all of this data. I have a front-end website on a different network that I would like to consume this data. I could easily set the OPC client to send the data to a SQL server and the website could read from it, but that would be a lot of writes. If I wanted the data to be updated every 10 seconds I'd be writing to the database every 10 seconds. (I could probably just serialize the 100 points to get 1 write / 10 seconds but that would also limit my ability to search the data later). This solution wouldn't scale very well. If I had 100 of these computers the situation would quickly grow out of hand. Obviously I am well out of my league here and I have no experience with working with a large amount of data like this. What are my options and what should I research?
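    One option to research is buffering readings in memory and writing them to SQL Server in batches, so the write rate stays roughly constant no matter how many collector machines there are. The sketch below is only an illustration of that idea; the table name, columns, and flush interval are made up:

        using System;
        using System.Collections.Concurrent;
        using System.Data;
        using System.Data.SqlClient;
        using System.Threading;

        public class ReadingBuffer
        {
            private readonly ConcurrentQueue<Tuple<string, DateTime, double>> _pending =
                new ConcurrentQueue<Tuple<string, DateTime, double>>();
            private readonly Timer _flushTimer;
            private readonly string _connectionString;

            public ReadingBuffer(string connectionString)
            {
                _connectionString = connectionString;
                // Flush once a minute instead of writing once per 10-second sample.
                _flushTimer = new Timer(_ => Flush(), null, TimeSpan.FromMinutes(1), TimeSpan.FromMinutes(1));
            }

            // Called by the OPC client every time a point is read.
            public void Add(string pointName, DateTime readAt, double value)
            {
                _pending.Enqueue(Tuple.Create(pointName, readAt, value));
            }

            private void Flush()
            {
                var table = new DataTable();
                table.Columns.Add("PointName", typeof(string));
                table.Columns.Add("ReadAt", typeof(DateTime));
                table.Columns.Add("Value", typeof(double));

                Tuple<string, DateTime, double> reading;
                while (_pending.TryDequeue(out reading))
                    table.Rows.Add(reading.Item1, reading.Item2, reading.Item3);

                if (table.Rows.Count == 0) return;

                using (var bulk = new SqlBulkCopy(_connectionString))
                {
                    bulk.DestinationTableName = "dbo.OpcReadings";   // hypothetical table
                    bulk.WriteToServer(table);
                }
            }
        }

    A message queue or time-series store would be the next step up if many collector machines feed the same website, but batched bulk inserts alone already cut the write count dramatically compared to one INSERT per reading.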

    Read the article

  • Can the following Domain Entity contain logic for creating/deleting other entities?

    - by user702769
    a) As far as I understand it, in most cases the Domain Model (DM) doesn't contain code for creating/deleting domain entities; instead it is the job of the layers on top of the DM (i.e. the service layer or UI layer) to create/delete domain entities? b) Domain entities are modelled after real-world entities. Assuming the particular real-world entity being abstracted does have the ability to create/delete other real-world entities, then I assume the domain entity abstracting this real-world entity could also contain logic for creating/deleting other entities?

        class RobotDestroyerCreator {
            ...
            void heavyThinking() {
                ...
                if(...)
                    unitOfWork.registerDelete(robot);
                ...
                if(...) {
                    var robotNew = new Robot(...);
                    unitOfWork.registerNew(robotNew);
                }
                ...
            }
        }

    Thank you
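    One common compromise, sketched below, is to let the entity decide which robots to create or remove but return those decisions as plain results, leaving the registration with the unit of work to an application service. This is just an illustrative sketch of that separation; the type and member names are invented, not taken from the question:

        using System.Collections.Generic;

        public class Robot { }

        // What the entity decided, with no persistence concerns attached.
        public class RobotChanges
        {
            public List<Robot> Created = new List<Robot>();
            public List<Robot> Removed = new List<Robot>();
        }

        public class RobotDestroyerCreator
        {
            private readonly List<Robot> _robots;

            public RobotDestroyerCreator(List<Robot> robots) { _robots = robots; }

            // Pure domain logic: decides what should exist, says nothing about how it is stored.
            public RobotChanges HeavyThinking()
            {
                var changes = new RobotChanges();
                if (_robots.Count > 10)
                    changes.Removed.Add(_robots[0]);
                if (_robots.Count == 0)
                    changes.Created.Add(new Robot());
                return changes;
            }
        }

        public interface IUnitOfWork
        {
            void RegisterNew(object entity);
            void RegisterDelete(object entity);
        }

        // Application service: the only place that knows about the unit of work.
        public class RobotService
        {
            public void Run(RobotDestroyerCreator creator, IUnitOfWork unitOfWork)
            {
                RobotChanges changes = creator.HeavyThinking();
                foreach (var robot in changes.Created) unitOfWork.RegisterNew(robot);
                foreach (var robot in changes.Removed) unitOfWork.RegisterDelete(robot);
            }
        }

    This keeps the domain behaviour of "this robot creates/destroys other robots" inside the entity while persistence stays with the layer above it.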

    Read the article

  • Retrieve data from an ASP.Net application using ADO.Net 2.0 connected model

    - by nikolaosk
    I have been teaching Entity Framework, LINQ to SQL, LINQ to Objects, and LINQ to XML for some time now. I am a huge fan of LINQ to Entities and I am using Entity Framework as my main data access technology. Entity Framework is in its second version right now and I can accomplish most of the things I need. I am sure the folks on the ADO.NET team will implement many more features in the future. I am a strong believer that you cannot really understand the benefits of LINQ to SQL or LINQ to Entities unless you...(read more)
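    Since the excerpt cuts off before the code, here is a generic sketch of what the ADO.NET connected model looks like: a SqlConnection opened explicitly, a SqlCommand, and a SqlDataReader streamed while the connection stays open. The connection string, table, and column names are placeholders, not values from the original article:

        using System;
        using System.Data;
        using System.Data.SqlClient;

        class ConnectedModelDemo
        {
            static void Main()
            {
                // Placeholder connection string - adjust to your own server and database.
                string connectionString = @"Data Source=.\SQLEXPRESS;Initial Catalog=Northwind;Integrated Security=True";

                using (var connection = new SqlConnection(connectionString))
                using (var command = new SqlCommand(
                    "SELECT ProductID, ProductName, UnitPrice FROM Products WHERE UnitPrice > @minPrice", connection))
                {
                    command.Parameters.Add("@minPrice", SqlDbType.Money).Value = 20m;

                    connection.Open();   // connected model: the connection stays open while reading
                    using (SqlDataReader reader = command.ExecuteReader())
                    {
                        while (reader.Read())
                        {
                            Console.WriteLine("{0} - {1}: {2:C}",
                                reader.GetInt32(0), reader.GetString(1), reader.GetDecimal(2));
                        }
                    }
                }   // connection closed and disposed here
            }
        }

    This is the contrast with the disconnected model (DataSet/DataAdapter) and with Entity Framework: the reader is forward-only and the connection is held open only for the duration of the loop.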

    Read the article

  • ASP.NET membership via the ASP.NET Web Site Administration Tool

    - by luppi
    I created a database with aspnet_regsql; the database was created in SQL Server 2008 and not in the data folder of my project (do I need to move it to the folder manually?). Next, in the Web Site Administration Tool I went to the provider section and clicked the Test button. I got an error: "Could not establish a connection to the database. If you have not yet created the SQL Server database, exit the Web Site Administration tool, use the aspnet_regsql command-line utility to create and configure the database, and then return to this tool to set the provider." Maybe I need to set something in web.config, like membership settings or connection strings (or should the ASP.NET Web Site Administration Tool create those settings for me)? Update: Maybe it happens because I am using the full SQL Server 2008 and not Express? Update 2: After setting the membership section and the connection string to my aspnetdb database, I opened Security - Security Setup Wizard - Define Roles (stage 4) in the Web Site Administration Tool and got this error: "An error was encountered. Please return to the previous page and try again. The following message may help in diagnosing the problem: Unable to connect to SQL Server database."

        at System.Web.Administration.WebAdminPage.CallWebAdminHelperMethod(Boolean isMembership, String methodName, Object[] parameters, Type[] paramTypes)
        at ASP.security_wizard_wizardpermission_ascx.OnInit(EventArgs e)
        at System.Web.UI.Control.InitRecursive(Control namingContainer)
        at System.Web.UI.Control.InitRecursive(Control namingContainer)
        at System.Web.UI.Control.InitRecursive(Control namingContainer)
        at System.Web.UI.Control.InitRecursive(Control namingContainer)
        at System.Web.UI.Control.InitRecursive(Control namingContainer)
        at System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint)
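    Pointing the membership provider at the full SQL Server instance usually does come down to web.config. The snippet below is a hedged example of what that configuration typically looks like; the server name, database name, and provider names are placeholders to adapt, not values taken from the question:

        <connectionStrings>
          <add name="AspNetDb"
               connectionString="Data Source=MYSERVER;Initial Catalog=aspnetdb;Integrated Security=True"
               providerName="System.Data.SqlClient" />
        </connectionStrings>
        <system.web>
          <membership defaultProvider="SqlProvider">
            <providers>
              <clear />
              <add name="SqlProvider"
                   type="System.Web.Security.SqlMembershipProvider"
                   connectionStringName="AspNetDb"
                   applicationName="/" />
            </providers>
          </membership>
          <roleManager enabled="true" defaultProvider="SqlRoleProvider">
            <providers>
              <clear />
              <add name="SqlRoleProvider"
                   type="System.Web.Security.SqlRoleProvider"
                   connectionStringName="AspNetDb"
                   applicationName="/" />
            </providers>
          </roleManager>
        </system.web>

    The default providers point at a LocalDB/SQL Express database under App_Data, which is why a full SQL Server instance needs an explicit connection string like the one above before the administration tool can connect.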

    Read the article

  • ASP MVC2 model binding issue on POST

    - by Brandon Linton
    So I'm looking at moving from MVC 1.0 to MVC 2.0 RTM. One of the conventions I'd like to start following is using the strongly-typed HTML helpers for generating controls like text boxes. However, it looks like it won't be an easy jump. I tried migrating my first form, replacing lines like this: <%= Html.TextBox("FirstName", Model.Data.FirstName, new {maxlength = 30}) %> ...for lines like this: <%= Html.TextBoxFor(x => x.Data.FirstName, new {maxlength = 30}) %> Previously, this would map into its appropriate view model on a POST, using the following method signature: [AcceptVerbs(HttpVerbs.Post)] public ActionResult Registration(AccountViewInfo viewInfo) Instead, it currently gets an empty object back. I believe the disconnect is in the fact that we pass the view model into a larger aggregate object that has some page metadata and other fun stuff along with it (hence x.Data.FirstName instead of x.FirstName). So my question is: what is the best way to use the strongly-typed helpers while still allowing the MVC framework to appropriately cast the form collection to my view-model as it does in the original line? Is there any way to do it without changing the aggregate type we pass to the view? Thanks!
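    One thing that may be worth trying, sketched below, is telling the default model binder about the "Data" prefix that TextBoxFor generates for nested properties. This is a hedged suggestion rather than a confirmed fix for the poster's setup; AccountPageViewModel is a hypothetical stand-in for the aggregate page type described in the question:

        using System.Web.Mvc;

        public class AccountViewInfo { public string FirstName { get; set; } }
        public class AccountPageViewModel { public AccountViewInfo Data { get; set; } }   // hypothetical aggregate

        public class AccountController : Controller
        {
            // TextBoxFor(x => x.Data.FirstName) renders an input named "Data.FirstName",
            // so one option is to bind the aggregate type that actually has a Data property...
            [AcceptVerbs(HttpVerbs.Post)]
            public ActionResult Registration(AccountPageViewModel page)
            {
                AccountViewInfo viewInfo = page.Data;
                // ... save viewInfo ...
                return View(page);
            }

            // ...or keep the original signature and strip the prefix during binding.
            [AcceptVerbs(HttpVerbs.Post)]
            public ActionResult RegistrationAlternative([Bind(Prefix = "Data")] AccountViewInfo viewInfo)
            {
                // viewInfo.FirstName is now populated from the "Data.FirstName" form field.
                return View(viewInfo);
            }
        }

    Either way the strongly typed helpers can stay, and the page metadata object does not need to change.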

    Read the article

  • Import Excel 2007 into SQL 2000 using Classic ASP and ADO

    - by jeff
    I have the following code from a legacy app which currently reads from an excel 2003 spreadsheet on a server, but I need this to run from my machine which uses excel 2007. When I debug on my machine ADO does not seem to be reading the spreadsheet. I have checked all file paths etc. and location of spreadsheet that is all fine. I've heard that you cannot use the jet db engine for excel 2007 anymore? Can someone confirm this? What do I need to do to get this to work? Please help! set obj_conn = Server.CreateObject("ADODB.Connection") obj_conn.Open "Provider=Microsoft.Jet.OLEDB.4.0;" & _ "Data Source=" & Application("str_folder") & "CNS43.xls;" & _ "Extended Properties=""Excel 8.0;""" set obj_rs_cns43 = Server.CreateObject("ADODB.RecordSet") obj_rs_cns43.ActiveConnection = obj_conn obj_rs_cns43.CursorType = 3 obj_rs_cns43.LockType = 2 obj_rs_cns43.Source = "SELECT * FROM [CNS43$]" obj_rs_cns43.Open
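    It is correct that the Jet 4.0 provider only understands the older .xls (BIFF) format and is unavailable on 64-bit processes; for workbooks saved by Excel 2007 the usual route is the Access Database Engine (ACE) OLE DB provider, which must be installed on the machine running the page. A hedged example of the changed connection string (the file name comes from the question, the rest is the standard ACE syntax) might look like this:

        ' For an .xls workbook read through the ACE provider:
        Provider=Microsoft.ACE.OLEDB.12.0;Data Source=<path>\CNS43.xls;Extended Properties="Excel 8.0;HDR=YES"

        ' For a workbook saved in the 2007 .xlsx format:
        Provider=Microsoft.ACE.OLEDB.12.0;Data Source=<path>\CNS43.xlsx;Extended Properties="Excel 12.0 Xml;HDR=YES"

    The rest of the classic ASP code (the ADODB.Connection, RecordSet, and "SELECT * FROM [CNS43$]") can stay as it is; only the provider and extended properties change.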

    Read the article

  • Custom model validation of dependent properties using Data Annotations

    - by Darin Dimitrov
    Until now I've used the excellent FluentValidation library to validate my model classes. In web applications I use it in conjunction with the jquery.validate plugin to perform client-side validation as well. One drawback is that much of the validation logic is repeated on the client side and is no longer centralized in a single place. For this reason I'm looking for an alternative. There are many examples out there showing the use of data annotations to perform model validation, and it looks very promising. One thing I couldn't find out is how to validate a property that depends on another property's value. Let's take for example the following model:

        public class Event
        {
            [Required]
            public DateTime? StartDate { get; set; }

            [Required]
            public DateTime? EndDate { get; set; }
        }

    I would like to ensure that EndDate is greater than StartDate. I could write a custom validation attribute extending ValidationAttribute in order to perform custom validation logic. Unfortunately I couldn't find a way to obtain the model instance:

        public class CustomValidationAttribute : ValidationAttribute
        {
            public override bool IsValid(object value)
            {
                // value represents the property value on which this attribute is applied,
                // but how do I obtain the object instance to which this property belongs?
                return true;
            }
        }

    I found that the CustomValidationAttribute seems to do the job because it has a ValidationContext property that contains the object instance being validated. Unfortunately this attribute was added only in .NET 4.0. So my question is: can I achieve the same functionality in .NET 3.5 SP1? UPDATE: It seems that FluentValidation already supports client-side validation and metadata in ASP.NET MVC 2. Still, it would be good to know whether data annotations could be used to validate dependent properties.
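    One approach that works on .NET 3.5 SP1 with ASP.NET MVC 2 is to put the validation attribute on the class instead of on a property, so that IsValid receives the whole model instance. The sketch below illustrates the idea; it is a generic example rather than a drop-in answer to the original question:

        using System;
        using System.ComponentModel.DataAnnotations;

        // Applied to the type, so 'value' in IsValid is the Event instance itself.
        [AttributeUsage(AttributeTargets.Class, AllowMultiple = false)]
        public class EndAfterStartAttribute : ValidationAttribute
        {
            public override bool IsValid(object value)
            {
                var model = value as Event;
                if (model == null || model.StartDate == null || model.EndDate == null)
                    return true;   // let [Required] report missing values
                return model.EndDate > model.StartDate;
            }
        }

        [EndAfterStart(ErrorMessage = "EndDate must be later than StartDate.")]
        public class Event
        {
            [Required]
            public DateTime? StartDate { get; set; }

            [Required]
            public DateTime? EndDate { get; set; }
        }

    The trade-off is that a class-level error is reported against the model as a whole rather than against the EndDate field, and client-side validation does not come for free with it.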

    Read the article

  • ado.net Concurrency violation

    - by Bicubic
    My first time using ADO.NET. I'm trying to make a database of users. First I populate my DataSet:

        adapter.AcceptChangesDuringFill = true;
        adapter.AcceptChangesDuringUpdate = true;
        adapter.Fill(dataset);

    To create a user:

        User user = new User();
        user.datarow = dataset.Users.NewUsersRow();
        user.Name = username;
        user.PasswordHash = GetHash(password);
        user.Rights = UserRights.None;
        users.Add(user);
        dataset.Users.AddUsersRow(user.datarow);
        adapter.Update(dataset);

    When a user property is modified:

        adapter.Update(dataset);

    Creation by itself is fine. If I take an existing user and make multiple changes, fine. Multiple creations in a row, fine. But a creation followed by a property change gives me this: "Concurrency violation: the UpdateCommand affected 0 of the expected 1 records." Any ideas?
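    A frequent cause of exactly this sequence failing is that the identity value generated by the INSERT never makes it back into the DataRow, so the later UPDATE searches for a stale key and matches zero rows. Below is a hedged sketch of one way to refresh the key after insert, assuming a SQL Server identity column named Id and a table named Users; the column names and types are guesses, not taken from the question:

        using System.Data;
        using System.Data.SqlClient;

        static class UserAdapterSetup
        {
            // Re-select the inserted row so the client-side DataRow picks up the generated Id;
            // otherwise the next Update looks for the old key and reports a concurrency violation.
            public static void ConfigureInsert(SqlDataAdapter adapter, SqlConnection connection)
            {
                var insert = new SqlCommand(
                    "INSERT INTO Users (Name, PasswordHash, Rights) VALUES (@Name, @PasswordHash, @Rights); " +
                    "SELECT Id, Name, PasswordHash, Rights FROM Users WHERE Id = SCOPE_IDENTITY();",
                    connection);
                insert.Parameters.Add("@Name", SqlDbType.NVarChar, 100, "Name");
                insert.Parameters.Add("@PasswordHash", SqlDbType.NVarChar, 200, "PasswordHash");
                insert.Parameters.Add("@Rights", SqlDbType.Int, 4, "Rights");

                // Copy the values returned by the trailing SELECT back into the DataRow.
                insert.UpdatedRowSource = UpdateRowSource.FirstReturnedRecord;
                adapter.InsertCommand = insert;
            }
        }

    Calling adapter.Fill again after the insert achieves the same refresh at the cost of an extra round trip, which can be a simpler first step while debugging.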

    Read the article

  • VB.NET - ASP.NET - MS-Access - SQL Statement

    - by Brian
    I have a button which, when pressed, sets the user's rights in the db (if Administrator, UserTypeID is set to '2', and if Customer it is set to '1'). However, when I run the code below, everything remains the same. I think the problem is in the SQL statement but I'm not sure. Can anyone help please?

        Protected Sub btnSetUser_Click(sender As Object, e As System.EventArgs) Handles btnSetUser.Click
            Dim conn As New OleDbConnection("Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\Users\Brian\Documents\Visual Studio 2010\WebSites\WebSite3\db.mdb;")
            Dim cmd As OleDbCommand = New OleDbCommand("UPDATE [User] SET [UserTypeID] WHERE Username=?", conn)
            conn.Open()
            cmd.Parameters.AddWithValue("@Username", txtUser.Text)
            If ddUserType.SelectedItem.Text = "Administrator" Then
                cmd.Parameters.AddWithValue("@UserTypeID", "2")
                cmd.ExecuteNonQuery()
                lblSetUser.Text = txtUser.Text + " was set to Administrator."
            ElseIf ddUserType.SelectedItem.Text = "Customer" Then
                cmd.Parameters.AddWithValue("@UserTypeID", "1")
                cmd.ExecuteNonQuery()
                lblSetUser.Text = txtUser.Text + " was set to Customer."
            End If
            conn.Close()
        End Sub
        End Class
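    As written, the UPDATE statement never assigns a value to [UserTypeID], and OLE DB parameters are matched by position rather than by name, so they must also be added in the same order as the ? placeholders. A hedged sketch of the corrected statement and the implied parameter order (table and column names taken from the question) is:

        -- Corrected statement: assign the new value, with one ? placeholder per parameter, in order.
        UPDATE [User] SET [UserTypeID] = ? WHERE [Username] = ?

        -- With OleDbCommand the parameters are positional, so add the UserTypeID
        -- parameter first and the Username parameter second, before calling ExecuteNonQuery.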

    Read the article

  • Pass a model object while using an upload

    - by Dejan.S
    I'm trying to pass my model object along with the file I'm uploading, but I'm stuck on how that should be done. This is the code I use now:

        [AcceptVerbs(HttpVerbs.Post)]
        public ActionResult Upload(PageBody pageBody)
        {
            foreach (string file in Request.Files)
            {
                var hpf = Request.Files[file] as HttpPostedFileBase;
                if (hpf.ContentLength == 0)
                    continue;
                string savedFileName = Path.Combine(AppDomain.CurrentDomain.BaseDirectory + "Content/Uploads/", Path.GetFileName(hpf.FileName));
                hpf.SaveAs(savedFileName);
                _pageBody.Pictures.Add(new PageBodyPicture() { Picture = file });
            }
            return View(pageBody);
        }

    Here is my view code. I get the model.Id there, but it won't pass even if I put just Id in the Upload method.

        <form action="/Admin/Upload" enctype="multipart/form-data" method="post">
            <%= Html.HiddenFor(model => model.Id)%>
            <input type="file" name="file" id="file" /><br />
            <input type="submit" name="submit" value="Upload" />
        </form>

    Read the article
