Search Results

Search found 13283 results on 532 pages for 'master pdf editor'.


  • git: how to not delete files when rebasing commits with file deletion

    - by Benjol
    I have a branch that I would like to rebase onto the latest commit on my master. The problem is that one of the intervening commits on master was to delete and ignore a particular set of files (see this question). If I just do a straight rebase, those files will get deleted again.

    Is there any way of doing this inside git, rather than copying all the files out by hand and then copying them back in again afterwards? Or should I do something like creating a new branch off master, then merging in just the commits from the old branch?

    Attempt at ASCII art:

        master  branch
          |       w     work in progress on branch
          C       |     committed further changes on master
          |       |
          B       |     committed delete/ignore files on master
          |       2     committed changes on branch
          |      /
          A     /       committed changes on master which I now need to get branch working
          |    1        committed changes on branch
          0___/         created branch

    (Doing the art, I realise that I could just rebase branch from A, then merge when I've finished, but I'd still like to know if there's a way to do this 'properly')

    UPDATE: Warning to anyone trying this. The solution proposed here is fine, but when you check out master again, the B commit will be re-applied, and you lose all your files again :(
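    A minimal sketch of the --onto form that implements the "rebase from A" idea, where A and 0 stand in for the real commit IDs or refs from the diagram above:

        # replay only the branch's own commits (1, 2, w) onto A,
        # so the delete/ignore commit B never takes part in the rebase
        git rebase --onto A 0 branch

    The trade-off noted in the update still applies: merging the result back into a master that contains B will bring the deletions with it.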

    Read the article

  • How to know if all the thread pool's threads are already done with their tasks?

    - by mcxiand
    I have an application that recurses all folders in a given directory and looks for PDFs. If a PDF file is found, the application counts its pages using iTextSharp. I did this by using a thread to recursively scan all the folders for PDFs; each PDF found is then queued to the thread pool. The code looks like this:

        // spawn a thread to handle the processing of PDFs in each folder
        var th = new Thread(() =>
        {
            pdfDirectories = Directory.GetDirectories(pdfPath);
            processDir(pdfDirectories);
        });
        th.Start();

        private void processDir(string[] dirs)
        {
            foreach (var dir in dirs)
            {
                pdfFiles = Directory.GetFiles(dir, "*.pdf");
                processFiles(pdfFiles);
                string[] newdir = Directory.GetDirectories(dir);
                processDir(newdir);
            }
        }

        private void processFiles(string[] files)
        {
            foreach (var pdf in files)
            {
                ThreadPoolHelper.QueueUserWorkItem(
                    new { path = pdf },
                    (data) => { processPDF(data.path); }
                );
            }
        }

    My problem is: how do I know that the thread pool's threads have finished processing all the queued items, so that I can tell the user the application is done with its intended task?
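    One common way to answer that question, sketched here with .NET 4's CountdownEvent; ProcessPdf stands in for the asker's processPDF, and the plain ThreadPool is used in place of the custom helper:

        using System;
        using System.Threading;

        class PdfScanner
        {
            // Keep a live count of queued items, plus one sentinel count so
            // Wait() cannot complete while items are still being queued.
            static void ProcessAll(string[] files)
            {
                using (var pending = new CountdownEvent(1))
                {
                    foreach (var pdf in files)
                    {
                        pending.AddCount();
                        ThreadPool.QueueUserWorkItem(state =>
                        {
                            try { ProcessPdf((string)state); }
                            finally { pending.Signal(); }   // one down, even on failure
                        }, pdf);
                    }
                    pending.Signal();   // release the sentinel count
                    pending.Wait();     // blocks until every queued item has signalled
                    Console.WriteLine("Done - all PDFs processed.");
                }
            }

            static void ProcessPdf(string path)
            {
                // page counting with iTextSharp would go here
            }
        }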

    Read the article

  • Does GitHub.com have to create a merge commit when you merge from a fork?

    - by Nishant
    I cloned the master and started doing my work. Due to permissions, I push my branch to my fork. I then sent a pull request to my master, and someone with permission does the merge. I notice that GitHub.com creates a merge commit snapshot, which to me looks like just a diff of the entire changes. That isn't strictly necessary, though it is helpful in the sense that I can look at the merge commit to see the entire diff. I can see the same SHA hashes as on my own branch, so the merge looks like an extra commit which probably isn't necessary, since it could be a fast-forward?

        master            - a
        myfork (computer) - a -> b -> c
        myfork (github)   - a -> b -> c

    The pull request from myfork to master (which it says it can automatically merge) shows the entire diff, and then when I merge it, it shows up as master - a -> b -> c -> d. The d is a merge commit which I think is not really required, because it is a fast-forward? Can someone explain why this happens? I think this would be the same scenario if I rebased after master had moved ahead, but that has not happened. Master is still at a when I merge.
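    For what it's worth, GitHub's merge button always records a merge commit (the equivalent of a --no-ff merge), even when a fast-forward would be possible. Someone with push permission can keep the fast-forward by merging locally instead; a sketch, assuming a remote named myfork:

        git checkout master
        git fetch myfork
        git merge --ff-only myfork/master   # master moves to c; no extra commit d
        git push origin master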

    Read the article

  • git push says everything up to date when it definitely is not

    - by Wolf
    I have a public repository. No one else has forked, pulled, or done anything else to it. I made some minor changes to one file, successfully committed them, and tried to push. It says 'Everything up-to-date'. There are no branches. I'm very, very new to git and I don't understand what on earth is going on. git remote show origin tells me:

        HEAD branch: master
        Remote branch:
          master tracked
        Local ref configured for 'git push':
          master pushes to master (up to date)

    Any ideas what I can do to make this understand that it's NOT up to date? Thanks

    Updates: git status:

        # On branch master
        # Untracked files:
        #   (use "git add ..." to include in what will be committed)
        #
        #       histmarkup.el
        #       vendor/yasnippet-0.6.1c/snippets/
        no changes added to commit (use "git add" and/or "git commit -a")

    git branch -a:

        * master
          remotes/origin/master

    git fsck:

        dangling tree 105cb101ca1a4d2cbe1b5c73eb4a238e22cb4998
        dangling tree 85bd0461f0fcb1618d46c8a80d3a4a7932de34bb

    Update 2: I re-opened the modified file, and the modifications I KNOW I had made were gone. So I added them again, went through the rigamarole of git status, git add filename, git commit -m "(message)", and git push origin master, and all of a sudden it works the way it's supposed to.
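    For the next time this happens, a quick check of what the remote is actually missing (assuming origin/master is the tracking branch) would have shown that nothing new had been committed, which is why push had nothing to do:

        git log origin/master..HEAD --oneline   # commits present locally but not upstream
        git status                              # untracked/unstaged changes never get pushed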

    Read the article

  • Prawn image position

    - by John
    I'm trying to lay out 6 images per page with Prawn in Ruby:

        case (idx % 6) # ugly
        when 0 : (pdf.start_new_page; pdf.image img, :position => :left, :vposition => :top, :width => 270)
        when 1 : pdf.image img, :position => :right, :vposition => :top, :width => 270
        when 2 : pdf.image img, :position => :left, :vposition => :center, :width => 270
        when 3 : pdf.image img, :position => :right, :vposition => :center, :width => 270
        when 4 : pdf.image img, :position => :left, :vposition => :bottom, :width => 270
        when 5 : pdf.image img, :position => :right, :vposition => :bottom, :width => 270
        end

    Not sure what I'm doing wrong, but it prints the first 3 images to the PDF, then creates a new page and prints the last three:

        Page 1: <img> <img> <blank> <blank> <blank> <blank>
        Page 2: <blank> <blank> <blank> <img> <img> <img>

    Any suggestions would help.
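    :vposition places an image relative to the current bounding box while the document cursor keeps flowing, which tends to produce exactly this kind of split across pages. Computing explicit coordinates with :at is usually more predictable; a sketch, assuming images is an array of file paths:

        require 'prawn'

        # images is assumed to be an array of image file paths
        Prawn::Document.generate("grid.pdf") do |pdf|
          images.each_slice(6).with_index do |page_images, page_idx|
            pdf.start_new_page if page_idx > 0
            page_images.each_with_index do |img, i|
              col = i % 2   # 0 => left column, 1 => right column
              row = i / 2   # 0 => top, 1 => middle, 2 => bottom
              x = col * (pdf.bounds.width / 2)
              y = pdf.bounds.top - row * (pdf.bounds.height / 3)
              pdf.image img, :at => [x, y], :width => 270
            end
          end
        end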

    Read the article

  • How to rename many URL-escaped (%XX) files to human-readable form

    - by F. Hauri
    I have downloaded a lot of files into one directory, but many of them are stored with URL-escaped filenames, containing percent signs followed by two hexadecimal chars, like:

        ls -ltr $HOME/Downloads/
        -rw-r--r-- 2 user user 13171425 24 nov 10:07 Swisscom%20Mobile%20Unlimited%20Kurzanleitung-%282011-05-12%29.pdf
        -rw-r--r-- 2 user user  1525794 24 nov 10:08 31010ENY-HUAWEI%20E173u-1%20HSPA%20USB%20Stick%20Quick%20Start-%28V100R001_01%2CEnglish%2CIndia-Reliance%2CC%2Ccolor%29.pdf
        ...

    All these names match the following form, with exactly three parts:

        1. name of the object
        2. -( revision and/or date, useless ... )
        3. extension

    My goal is to have one command to rename all these files to obtain:

        -rw-r--r-- 2 user user 13171425 24 nov 10:07 Swisscom_Mobile_Unlimited_Kurzanleitung.pdf
        -rw-r--r-- 2 user user  1525794 24 nov 10:08 31010ENY-HUAWEI_E173u-1_HSPA_USB_Stick_Quick_Start.pdf

    I've successfully done the job in pure bash with:

        urlunescape() {
            local srce="$1" done=false part1 newname ext
            while ! $done; do
                part1="${srce%%%*}"
                newname="$part1\\x${srce:${#part1}+1:2}${srce:${#part1}+3}"
                [ "$part1" == "$srce" ] && done=true || srce="$newname"
            done
            newname="$(echo -e $srce)"
            ext=${newname##*.}
            newname="${newname%-(*}"
            echo ${newname// /_}.$ext
        }
        for file in *; do
            mv -i "$file" "$(urlunescape "$file")"
        done

    or using sed, tr, bash and sed again:

        for file in *; do
            echo -e $(
                echo $file | sed 's/%\(..\)/\\x\1/g'
            ) | sed 's/-(.*\.\([^\.]*\)$/.\1/' |
                tr \  \\n _\\0 | xargs -0 mv -i "$file"
        done

    Both produce:

        ls -ltr
        -rw-r--r-- 2 user user 13171425 24 nov 10:07 Swisscom_Mobile_Unlimited_Kurzanleitung.pdf
        -rw-r--r-- 2 user user  1525794 24 nov 10:08 31010ENY-HUAWEI_E173u-1_HSPA_USB_Stick_Quick_Start.pdf

    But I'm sure there must exist a simpler and/or shorter way to do this.
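    If the Perl-based rename(1) utility (shipped as rename or prename on Debian-style systems) is available, each of the three transformations can be expressed as one substitution; a sketch, assuming the three-part naming convention above holds for every file in the directory:

        rename 's/%([0-9A-Fa-f]{2})/chr hex $1/ge' *   # undo the %XX escapes
        rename 's/-\(.*\)(\.[^.]+)$/$1/' *             # drop the "-(...)" middle part
        rename 'y/ /_/' *                              # spaces to underscores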

    Read the article

  • How can I get a filename from a path with Perl?

    - by Eric
    I am trying to parse the filename from paths. I have this:

        my $filepath = "/Users/Eric/Documents/foldername/filename.pdf";
        $filepath =~ m/^.*\\(.*[.].*)$/;
        print "Linux path:";
        print $1 . "\n\n";
        print "-------\n";

        my $filepath = "c:\\Windows\eric\filename.pdf";
        $filepath =~ m/^.*\\(.*[.].*)$/;
        print "Windows path:";
        print $1 . "\n\n";
        print "-------\n";

        my $filepath = "filename.pdf";
        $filepath =~ m/^.*\\(.*[.].*)$/;
        print "Without path:";
        print $1 . "\n\n";
        print "-------\n";

    But that returns:

        Linux path:

        -------
        Windows path:Windowsic
        ilename.pdf
        -------
        Without path:Windowsic
        ilename.pdf
        -------

    I am expecting this:

        Linux path: filename.pdf
        -------
        Windows path: filename.pdf
        -------
        Without path: filename.pdf
        -------

    Can somebody please point out what I am doing wrong? Thanks! :)
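    Two things go wrong here: the pattern only matches backslashes (so the Unix path and the bare filename never match, and $1 keeps whatever it matched previously), and in the double-quoted string "c:\\Windows\eric\filename.pdf" the \e and \f become escape and form-feed characters. A sketch of the usual fixes, using single quotes and a separator class that covers both styles:

        use strict;
        use warnings;
        use File::Basename;

        my @paths = (
            '/Users/Eric/Documents/foldername/filename.pdf',
            'c:\Windows\eric\filename.pdf',   # single quotes keep the backslashes literal
            'filename.pdf',
        );

        for my $filepath (@paths) {
            # grab everything after the last slash OR backslash
            my ($name) = $filepath =~ m{([^/\\]+)\z};
            print "$name\n";                  # filename.pdf in all three cases
        }

        # For paths in the local OS's native style, the core module also works:
        print basename('/Users/Eric/Documents/foldername/filename.pdf'), "\n";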

    Read the article

  • Monitoring outgoing messages using EXIM

    - by dashmug
    I work as an IT guy in a law firm. I have recently been asked to make a system wherein all the outgoing emails from our server to our clients will be put on hold first and wait for approval before they get sent to the client. Our mail server uses Exim (that's what it says in cPanel).

    I am planning to create filters where the outgoing emails will be forwarded to an editor account. The editor will then review and edit the contents of the email. When the editor approves the email, it will be sent to the client by the editor, but still using the original sender in the "From:" and "Reply-To:" fields. I found some pointers on this site: http://www.devco.net/archives/2006/03/24/saving_copies_of_all_email_using_exim.php.

    Once the filters are in place, I want to make a simple PHP interface for the editor to check the forwarded emails and edit them if necessary. The editor can then click an "Approve" button that finally delivers the message using the original sender. I'm also thinking that maybe a PHP-less system will be enough: the editor receives the emails in his own email client, edits them, and simply sends the email as if he were the original sender.

    Is my plan feasible? Will there be issues that I have overlooked? Does it run the danger of being treated as spam by the other mail servers, since I'll be messing up the headers?

    Read the article

  • Migrating to Windows Server 2008 R2 Domain Controllers - a few Questions/Issues

    - by Chris
    Ok so here's our setup: we have 2 Windows 2003 Domain Controllers. I am trying to replace them with Windows 2008 R2. The Windows 2003 servers are DC01 and DC02. The Windows 2008 R2 servers are DC1 and DC2.

    I prepared the Windows Server 2003 forest schema for a domain controller that runs Windows Server 2008 or Windows Server 2008 R2. Then, with both of the new servers up as member servers, I dcpromo'd DC1 using the advanced option and added it successfully to my existing domain. Its roles are GC, DNS and Active Directory Domain Services. I transferred the PDC, RID pool manager and Infrastructure master FSMO roles to the new DC (DC1). The Schema master and Domain naming master are still on the old DC (DC01).

    The first issue I'm encountering is that when I dcpromo the second DC (DC2), select "Replicate data over the network from an existing domain controller", and select the new DC to replicate from (DC1), I get the following error:

        Failed to identify the requested replica partner (dc1.xxx.org) as a valid domain
        controller with a machine account for (DC2$). This is likely due to either the
        machine account not being replicated to this domain controller because of
        replication latency or the domain controller not advertising the Active Directory
        Domain Services. Please consider retrying the operation with \dc01.xxx.org as the
        replica partner. "The server is unwilling to process the request."

    Is this because the Schema master and Domain naming master roles are still on the old DC (DC01)? And if so, if I transfer the Schema master and Domain naming master roles to DC1, what is the risk of breaking my AD? I'm a little paranoid because this process HAS to be transparent. ANY downtime or interruption will result in me getting a verbal ass-kicking from my I.T. Director. Both of the new servers' DNS settings point to the old DNS servers (DC01 and DC02), not themselves, by the way. Thanks in Advance -Chris

    Read the article

  • Openldap startup problems after upgrade

    - by Craig Efrein
    I am trying to synchronize an LDAP slave and master server. The master server is using OpenLDAP 2.3.43-12 and the slave server is using OpenLDAP 2.4.23. I copied over the files in /var/lib/ldap, started the server, and got this error:

        Oct 22 16:16:41 xe-ldap-slave1 slapd[12111]: bdb(dc=myserver,dc=fr): Program version 4.7 doesn't match environment version 4.4
        Oct 22 16:16:41 xe-ldap-slave1 slapd[12111]: bdb_db_open: database "dc=myserver,dc=fr" cannot be opened, err -30971. Restore from backup!
        Oct 22 16:16:41 xe-ldap-slave1 slapd[12111]: bdb(dc=myserver,dc=fr): txn_checkpoint interface requires an environment configured for the transaction subsystem
        Oct 22 16:16:41 xe-ldap-slave1 slapd[12111]: bdb_db_close: database "dc=myserver,dc=fr": txn_checkpoint failed: Invalid argument (22).
        Oct 22 16:16:41 xe-ldap-slave1 slapd[12111]: backend_startup_one (type=bdb, suffix="dc=myserver,dc=fr"): bi_db_open failed! (-30971)
        Oct 22 16:16:41 xe-ldap-slave1 slapd[12111]: bdb_db_close: database "dc=myserver,dc=fr": alock_close failed

    I have used the db_upgrade command to upgrade the database files on the new slave server, but I still get the same error when starting slapd.

    The master server is CentOS 5.5 32-bit with OpenLDAP 2.3.43-12. The slave server is CentOS 6.3 64-bit with OpenLDAP 2.4.23. Everything was installed using yum. What is the proper method to synchronize database files from an LDAP master server to a slave server when the slave server is more recent than the master?

    I have followed the suggestion from 84104, but I am getting an error on the slave:

        Oct 23 18:28:30 xe-ldap-slave1 slapd[1415]: slap_client_connect: URI=ldaps://ldap0.lan.myserver.com:636 DN="cn=syncuser,dc=myserver,dc=fr" ldap_sasl_bind_s failed (-1)
        Oct 23 18:28:30 xe-ldap-slave1 slapd[1415]: do_syncrepl: rid=003 rc -1 retrying

    Here is the error on the master:

        Oct 23 18:29:30 ldap0 slapd[15265]: conn=201 fd=35 ACCEPT from IP=192.168.150.100:47690 (IP=0.0.0.0:636)
        Oct 23 18:29:30 ldap0 slapd[15265]: conn=201 fd=35 closed (TLS negotiation failure)

    I can do an LDAP search on the master just fine with the user configured for synchronization from the new slave server:

        ldapsearch -LLL -x -H ldaps://192.168.150.99:636 -x -W -b dc=myserver,dc=fr -D "cn=syncuser,dc=myserver,dc=fr"
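    The first error is the giveaway: the raw files under /var/lib/ldap are Berkeley DB 4.4 files, while the slave's OpenLDAP is linked against BDB 4.7, so copying them across directly will not work. The version-neutral route is an LDIF dump and reload; a sketch, assuming the default database paths and service names:

        # on the 2.3 master: dump the whole directory as LDIF
        slapcat -l master.ldif

        # on the 2.4 slave: stop slapd, clear the old files, reload, fix ownership
        service slapd stop
        rm -f /var/lib/ldap/*
        slapadd -l master.ldif
        chown -R ldap:ldap /var/lib/ldap
        service slapd start

    The later syncrepl failure looks separate: the master is closing the connection on "TLS negotiation failure", so the slave's certificate trust setup (for example TLS_CACERT in ldap.conf) is worth checking.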

    Read the article

  • Migrating to Windows Server 2008 R2 Domain Controllers - a few Questions/Issues

    - by Chris
    Ok so here's our setup: We have 2 Windows 2003 Domain Controllers. I am trying to replace them with Windows 2008 R2. The 2003 servers are named DC01 and DC02. The 2008 R2 servers are DC1 and DC2.

    I prepared the Windows Server 2003 forest schema for a domain controller that runs Windows Server 2008 or Windows Server 2008 R2. Then, with both of the new servers up as member servers, I ran dcpromo on DC1 using the advanced option and added it successfully to my existing domain. Its roles are GC, DNS and Active Directory Domain Services. I transferred the PDC Emulator, RID Pool Manager, and Infrastructure Master roles to DC1. The Schema Master and Domain Naming Master are still on DC01.

    The first issue that I'm encountering is that when I dcpromo DC2 and select "Replicate data over the network from an existing domain controller", choosing DC1 as the controller to replicate from, I get the following error:

        Failed to identify the requested replica partner (dc1.xxx.org) as a valid domain
        controller with a machine account for (DC2$). This is likely due to either the
        machine account not being replicated to this domain controller because of
        replication latency or the domain controller not advertising the Active Directory
        Domain Services. Please consider retrying the operation with \dc01.xxx.org as the
        replica partner. "The server is unwilling to process the request."

    Is this because the Schema Master and Domain Naming Master roles are still on the old DC01? And if so, if I transfer the Schema Master and Domain Naming Master roles to DC1, what is the risk of breaking my AD? I'm a little paranoid because this process HAS to be transparent. ANY downtime or interruption will result in me getting a verbal ass-kicking from my I.T. Director. Both of the new servers' DNS settings point to the old DNS servers (DC01 and DC02), not themselves, by the way.
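    Transferring the two remaining forest-level roles is generally a routine operation when replication between the DCs is healthy; the role simply moves, with nothing rebuilt. A sketch using the AD PowerShell module that ships with 2008 R2 (server names as in the question):

        Import-Module ActiveDirectory
        Move-ADDirectoryServerOperationMasterRole -Identity "DC1" `
            -OperationMasterRole SchemaMaster, DomainNamingMaster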

    Read the article

  • Apache inflate application/ with mod_filter

    - by BGT
    I need to prevent PDF objects from being gzipped. Really, this only needs to take place if the request is from the Mozilla browser (but since I can't get something as seemingly simple as no-gzip for application/pdf, I figure it's wiser to start there). From looking at the Apache documentation on mod_filter, I've got the following:

        <Location />
            FilterDeclare gzipDeflate CONTENT_SET
            FilterDeclare gzipInflate CONTENT_SET
            FilterProvider gzipDeflate deflate req=User-Agent $Mozilla/
            FilterProvider gzipInflate inflate resp=Content-Type $application/
            FilterChain +gzipDeflate +gzipInflate
        </Location>

    From my testing, the gzipDeflate filter is doing its job, and all the pages without a Content-Type starting with application are being gzipped. But the gzipInflate doesn't seem to be working at all. I've inspected the response in Firebug and verified that the Content-Type being sent down is application/pdf.

    I'll go ahead and ask a potentially stupid question though: the response's Content-Type header in its entirety reads "application/pdf; charset=Windows-1252". Does that make any sort of difference, or is $application/ presumably enough to catch that?

    Any help is greatly appreciated. One other point: the URL that returns the PDF object does not have the .pdf extension. The PDF itself is stored in an Oracle database as a blob and appended to the page when appropriate (all the URLs in the system use the same baseline). This was part of an original inquiry by a helpful member at stackoverflow who pointed me towards mod_filter and suggested I post the question here.
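    A simpler angle worth trying before wrestling with the inflate filter: never deflate the PDFs in the first place, by compressing only an explicit whitelist of types. AddOutputFilterByType matches on the media type, so the charset suffix stops mattering; the type list below is illustrative:

        <Location />
            AddOutputFilterByType DEFLATE text/html text/plain text/css text/javascript
        </Location>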

    Read the article

  • CQRS - Should a Command try to create a "complex" master-detail entity?

    - by Simon Crabtree
    I've been reading Greg Young and Udi Dahan's thoughts on Command Query Responsibility Separation, and a lot of what I read strikes a chord with me. My domain (we track vehicles which are doing deliveries) has the concept of a Route which contains one or more Stops. I need my customers to be able to set these up in our system by calling a web service, and then be able to retrieve information about a Route and how the vehicle is progressing.

    In the past I would have had "cut-down" DTO classes which closely resemble my domain classes; the customer would create a RouteDto with an array of StopDto(s) and call our CreateRoute web method, passing in the RouteDto. When they query our system by calling the GetRouteDetails method, I would return exactly the same objects to them.

    One of the appealing aspects of CQRS is that the RouteDto might have all manner of properties that the customer wants to query, but has no business setting when they create a Route. So I create a separate CreateRouteRequest class which is passed in when calling the CreateRoute "command", and a Route DTO class which gets returned as a query result:

        class Route {
            string Reference;
            List<Stop> Stops;
        }

    But I need my customer to provide me with Route AND Stop details when they create a route. As I see it, I could either...

    1. Give my CreateRouteRequest class a Stops property which is an array of "something" representing the data they need to provide about each stop - but what do I call this class? It's not a Stop, as that's what I'm calling the DTO inside my Route DTO, but I don't like "CreateStopRequest". I also wonder if I'm stuck in a CRUD mindset here, thinking in terms of master-detail information and asking the customer to think like that too.

            class CreateRouteRequest {
                string Reference;
                ...
                List<CreateStopRequest> Stops;
            }

    2. They call CreateRoute, and then make a number of calls to an AddStopToRoute method. This feels a bit more "behavioural", but I'm going to lose the ability to treat creating a route, including its stops, as a single atomic command. If they create a Route and then try to add a Stop which fails due to some validation problem, they're going to have a partially correct Route.

    The fact that I can't come up with a good name for the list of "StopCreationData" objects I'd be working with in option 1 makes me wonder if there's something I'm missing.
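    One naming option that keeps the single atomic command is to name the nested items after the data they carry rather than after a CRUD noun; a sketch, with all names hypothetical:

        using System;
        using System.Collections.Generic;

        // One atomic command: the route and everything needed to create its stops.
        public class CreateRouteRequest
        {
            public string Reference { get; set; }
            public List<StopDetails> Stops { get; set; }
        }

        // Deliberately not called "Stop": it is the creation data for a stop,
        // not the queryable Stop DTO returned by GetRouteDetails.
        public class StopDetails
        {
            public string Location { get; set; }
            public DateTime? PlannedArrival { get; set; }
        }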

    Read the article

  • New Validation Attributes in ASP.NET MVC 3 Future

    - by imran_ku07
    Introduction:

    Validating user input is a very important step in collecting information from users, because it helps you prevent errors when processing data. Incomplete or improperly formatted user input will create a lot of problems for your application. Fortunately, ASP.NET MVC 3 makes it very easy to handle the most common input validations. ASP.NET MVC 3 includes the Required, StringLength, Range, RegularExpression, Compare and Remote validation attributes for common input validation scenarios. These validation attributes cover most of your user input, but validation for Email, File Extension, Credit Card, URL, etc. is still missing. Fortunately, some of these validation attributes are available in ASP.NET MVC 3 Future. In this article, I will show you how to leverage the Email, Url, CreditCard and FileExtensions validation attributes (which are available in ASP.NET MVC 3 Future) in an ASP.NET MVC 3 application.

    Description:

    First of all you need to download the ASP.NET MVC 3 RTM source code from here. Then extract all files into a folder and open the MvcFutures project from the mvc3-rtm-sources\mvc3\src\MvcFutures folder. Build the project. In case you get compile-time error(s), simply remove the references to the System.Web.WebPages and System.Web.Mvc assemblies, add references to System.Web.WebPages and System.Web.Mvc 3 again from the .NET tab, and build the project once more. It will create a Microsoft.Web.Mvc assembly inside the mvc3-rtm-sources\mvc3\src\MvcFutures\obj\Debug folder. Now we can use the Microsoft.Web.Mvc assembly inside our application.

    Create a new ASP.NET MVC 3 application. For demonstration purposes, I will create a dummy model, UserInformation. So create a new class file UserInformation.cs inside the Model folder and add the following code:

        public class UserInformation
        {
            [Required]
            public string Name { get; set; }

            [Required]
            [EmailAddress]
            public string Email { get; set; }

            [Required]
            [Url]
            public string Website { get; set; }

            [Required]
            [CreditCard]
            public string CreditCard { get; set; }

            [Required]
            [FileExtensions(Extensions = "jpg,jpeg")]
            public string Image { get; set; }
        }

    Inside the UserInformation class, I am using the Email, Url, CreditCard and FileExtensions validation attributes, which are defined in the Microsoft.Web.Mvc assembly. By default, FileExtensionsAttribute allows the png, jpg, jpeg and gif extensions. You can override this through the Extensions property of the FileExtensionsAttribute class.

    Then just open (or create) HomeController.cs and add the following code:

        public class HomeController : Controller
        {
            public ActionResult Index()
            {
                return View();
            }

            [HttpPost]
            public ActionResult Index(UserInformation u)
            {
                return View();
            }
        }

    Next, just open (or create) the Index view for the Home controller and add the following code:

        @model NewValidationAttributesinASPNETMVC3Future.Model.UserInformation
        @{
            ViewBag.Title = "Index";
            Layout = "~/Views/Shared/_Layout.cshtml";
        }
        <h2>Index</h2>
        <script src="@Url.Content("~/Scripts/jquery.validate.min.js")" type="text/javascript"></script>
        <script src="@Url.Content("~/Scripts/jquery.validate.unobtrusive.min.js")" type="text/javascript"></script>
        @using (Html.BeginForm())
        {
            @Html.ValidationSummary(true)
            <fieldset>
                <legend>UserInformation</legend>
                <div class="editor-label">@Html.LabelFor(model => model.Name)</div>
                <div class="editor-field">
                    @Html.EditorFor(model => model.Name)
                    @Html.ValidationMessageFor(model => model.Name)
                </div>
                <div class="editor-label">@Html.LabelFor(model => model.Email)</div>
                <div class="editor-field">
                    @Html.EditorFor(model => model.Email)
                    @Html.ValidationMessageFor(model => model.Email)
                </div>
                <div class="editor-label">@Html.LabelFor(model => model.Website)</div>
                <div class="editor-field">
                    @Html.EditorFor(model => model.Website)
                    @Html.ValidationMessageFor(model => model.Website)
                </div>
                <div class="editor-label">@Html.LabelFor(model => model.CreditCard)</div>
                <div class="editor-field">
                    @Html.EditorFor(model => model.CreditCard)
                    @Html.ValidationMessageFor(model => model.CreditCard)
                </div>
                <div class="editor-label">@Html.LabelFor(model => model.Image)</div>
                <div class="editor-field">
                    @Html.EditorFor(model => model.Image)
                    @Html.ValidationMessageFor(model => model.Image)
                </div>
                <p>
                    <input type="submit" value="Save" />
                </p>
            </fieldset>
        }
        <div>
            @Html.ActionLink("Back to List", "Index")
        </div>

    Now just run your application. You will find that both client-side and server-side validation for the above validation attributes work smoothly.

    Summary:

    Email, URL, credit card and file extension validations are very common input validations. In this article, I showed you how to validate these inputs in your application, and I explained it with an example. I am also attaching a sample application which includes Microsoft.Web.Mvc.dll, so you can add a reference to the Microsoft.Web.Mvc assembly directly instead of doing any manual work. Hope you will enjoy this article too.

    Read the article

  • Java Day Tokyo 2014 Report

    - by OTN-J Master
    Java Day Tokyo 2014 was held on May 22 under the theme "Create the Future with Java," with sessions across the Java platform and a strong focus on Java 8 and the Internet of Things.

    IoT and Java: Nandini Ramani (Java SE/Java ME), Cameron Purdy (Java EE) and Simon Ritter spoke on the direction of the Java platform, including Java SE 8 and Oracle Java for MIPS (32-bit/64-bit). Session materials are to be published on OTN.

    Demos: three IoT demonstrations, first shown at JavaOne 2013, were presented:

    - Lego Mindstorms: Stephen Chin (Java Technology Ambassador) demonstrated a Lego Duke Segway built from Lego Mindstorms, with a Duke figure riding a self-balancing two-wheeler.
    - DukePad: a tablet built on a Raspberry Pi running Java SE Embedded, with its user interface written in JavaFX.
    - Chess Robot: a chess-playing robot driven from the DukePad.

    Java's anniversary: 2015 marks 20 years of Java.

    Java Magazine: the Japanese edition of Java Magazine was introduced, with coverage including Java 8 and IoT; it is available free of charge through OTN as a PDF and eBook.

    Photos from the event are on Facebook: Java Day Tokyo / Java Day Tokyo 2013.

    Read the article

  • Now Available: Getting Started with Oracle Database 12c

    - by OTN-J Master
    Oracle Database 12c is now available. The "c" in the name stands for "cloud": the release is aimed at consolidating many databases and operating them efficiently as one, making better use of IT resources while keeping databases isolated from each other.

    - Oracle product page: Oracle Database 12c [in Japanese]
    - OTN special page: Oracle Database 12c [in Japanese]
    - Oracle Database New Features Guide 12c Release 1 (English): [PDF] [HTML]

    OTN Japan is also publishing introductory seminar material on Oracle Database 12c; the materials will be posted on the OTN seminar page. For updates on Oracle Database 12c content, watch the OTN RSS feed and the OTN Twitter account.

    Read the article

  • Insert Error: CREATE DATABASE permission denied in database 'master'; cannot attach the file

    - by user1300580
    I have a register page on the website I am creating, and it saves the data entered by the user into a database. However, when I click the register button, I get the following error:

        Insert Error: CREATE DATABASE permission denied in database 'master'.
        Cannot attach the file 'C:\Users\MyName\Documents\MyName\Docs\Project\SJ\App_Data\SJ-Database.mdf' as database 'SJ-Database'.

    These are my connection strings:

        <connectionStrings>
          <add name="ApplicationServices"
               connectionString="data source=.\SQLEXPRESS;Integrated Security=SSPI;AttachDBFilename=|DataDirectory|\aspnetdb.mdf;User Instance=true"
               providerName="System.Data.SqlClient"/>
          <add name="MyConsString"
               connectionString="data source=.\SQLEXPRESS;Integrated Security=SSPI;AttachDBFilename=|DataDirectory|SJ-Database.mdf; Initial Catalog=SJ-Database; Integrated Security=SSPI;"
               providerName="System.Data.SqlClient" />
        </connectionStrings>

    Register page code:

        using System;
        using System.Collections.Generic;
        using System.Linq;
        using System.Web;
        using System.Web.UI;
        using System.Web.UI.WebControls;
        using System.Data;
        using System.Data.SqlClient;

        public partial class About : System.Web.UI.Page
        {
            protected void Page_Load(object sender, EventArgs e)
            {
            }

            public string GetConnectionString()
            {
                // sets the connection string from your web config file; "ConnString" is the name of your connection string
                return System.Configuration.ConfigurationManager.ConnectionStrings["MyConsString"].ConnectionString;
            }

            private void ExecuteInsert(string name, string gender, string age, string address, string email)
            {
                SqlConnection conn = new SqlConnection(GetConnectionString());
                string sql = "INSERT INTO Register (Name, Gender, Age, Address, Email) VALUES " +
                             " (@Name,@Gender,@Age,@Address,@Email)";
                try
                {
                    conn.Open();
                    SqlCommand cmd = new SqlCommand(sql, conn);
                    SqlParameter[] param = new SqlParameter[6];
                    //param[0] = new SqlParameter("@id", SqlDbType.Int, 20);
                    param[0] = new SqlParameter("@Name", SqlDbType.VarChar, 50);
                    param[1] = new SqlParameter("@Gender", SqlDbType.Char, 10);
                    param[2] = new SqlParameter("@Age", SqlDbType.Int, 100);
                    param[3] = new SqlParameter("@Address", SqlDbType.VarChar, 50);
                    param[4] = new SqlParameter("@Email", SqlDbType.VarChar, 50);
                    param[0].Value = name;
                    param[1].Value = gender;
                    param[2].Value = age;
                    param[3].Value = address;
                    param[4].Value = email;
                    for (int i = 0; i < param.Length; i++)
                    {
                        cmd.Parameters.Add(param[i]);
                    }
                    cmd.CommandType = CommandType.Text;
                    cmd.ExecuteNonQuery();
                }
                catch (System.Data.SqlClient.SqlException ex)
                {
                    string msg = "Insert Error:";
                    msg += ex.Message;
                    throw new Exception(msg);
                }
                finally
                {
                    conn.Close();
                }
            }

            protected void cmdRegister_Click(object sender, EventArgs e)
            {
                if (txtRegEmail.Text == txtRegEmailCon.Text)
                {
                    // call the method to execute insert to the database
                    ExecuteInsert(txtRegName.Text, txtRegAge.Text, ddlRegGender.SelectedItem.Text,
                                  txtRegAddress.Text, txtRegEmail.Text);
                    Response.Write("Record was successfully added!");
                    ClearControls(Page);
                }
                else
                {
                    Response.Write("Email did not match");
                    txtRegEmail.Focus();
                }
            }

            public static void ClearControls(Control Parent)
            {
                if (Parent is TextBox)
                {
                    (Parent as TextBox).Text = string.Empty;
                }
                else
                {
                    foreach (Control c in Parent.Controls)
                        ClearControls(c);
                }
            }
        }
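    For what it's worth, this error typically means SqlClient is trying to attach the .mdf as a new database, which requires CREATE DATABASE rights in master. A sketch of a user-instance connection string that avoids the server-level attach (note the missing backslash after |DataDirectory| in the original, and the Initial Catalog dropped so the name and file don't conflict):

        <add name="MyConsString"
             connectionString="Data Source=.\SQLEXPRESS;AttachDbFilename=|DataDirectory|\SJ-Database.mdf;Integrated Security=True;User Instance=True"
             providerName="System.Data.SqlClient" />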

    Read the article

  • The Stub Proto: Not Just For Stub Objects Anymore

    - by user9154181
    One of the great pleasures of programming is to invent something for a narrow purpose, and then to realize that it is a general solution to a broader problem. In hindsight, these things seem perfectly natural and obvious. The stub proto area used to build the core Solaris consolidation has turned out to be one of those things.

    As discussed in an earlier article, the stub proto area was invented as part of the effort to use stub objects to build the core ON consolidation. Its purpose was merely as a place to hold stub objects. However, we keep finding other uses for it. It turns out that the stub proto should be more properly thought of as an auxiliary place to put things that we would like to put into the proto to help us build the product, but which we do not wish to package or deliver to the end user. Stub objects are one example, but private lint libraries, header files, archives, and relocatable objects are all examples of things that might profitably go into the stub proto.

    Without a stub proto, these items were handled in a variety of ad hoc ways. If one part of the workspace needed private header files, libraries, or other such items, it might modify its Makefile to reach up and over to the place in the workspace where those things live and use them from there. There are several problems with this:

    - Each component invents its own approach, meaning that programmers maintaining the system have to invest extra effort to understand what things mean. In the past, this has created makefile ghettos in which only the person who wrote the makefiles feels confident to modify them, while everyone else ignores them. This causes many difficulties and benefits no one.
    - These interdependencies are not obvious to the make utility, and can lead to races.
    - They are not obvious to the human reader, who may therefore not realize that they exist, and break them.

    Our policy in ON is not to deliver files into the proto unless those files are intended to be packaged and delivered to the end user. However, sometimes non-shipping files were copied into the proto anyway, causing a different set of problems:

    - It requires a long list of exceptions to silence our normal unused-proto-item error checking.
    - In the past, we have accidentally shipped files that we did not intend to deliver to the end user.
    - Mixing cruft with valuable items makes it hard to discern which is which.

    The stub proto area offers a convenient and robust solution. Files needed to build the workspace that are not delivered to the end user can instead be installed into the stub proto. No special exceptions or custom make rules are needed, and the intent is always clear. We are already accessing some private lint libraries and compilation symlinks in this manner. Ultimately, I'd like to see all of the files in the proto that have a packaging exception delivered to the stub proto instead, and the elimination of all existing special-case makefile rules. This would include shared objects, header files, and lint libraries. I don't expect this to happen overnight; it will be a long-term, case-by-case project, but the overall trend is clear.

    The Stub Proto, -z assert-deflib, And The End Of Accidental System Object Linking

    We recently used the stub proto to solve an annoying build issue that goes back to the earliest days of Solaris: how to ensure that we're linking to the OS bits we're building instead of to those from the running system.

    The Solaris product is made up of objects and files from a number of different consolidations, each of which is built separately from the others from an independent code base called a gate. The core Solaris OS consolidation is ON, which stands for "Operating System and Networking". You will frequently also see ON called the OSnet. There are consolidations for X11 graphics, the desktop environment, open source utilities, compilers and development tools, and many others. The collection of consolidations that make up Solaris is known as the "Wad Of Stuff", usually referred to simply as the WOS.

    None of these consolidations is self-contained. Even the core ON consolidation has some dependencies on libraries that come from other consolidations. The build server used to build the OSnet must be running a relatively recent version of Solaris, which means that its objects will be very similar to the new ones being built. However, it is necessarily true that the build system objects will always be a little behind, and that incompatible differences may exist.

    The objects built by the OSnet link to other objects. Some of these dependencies come from the OSnet, while others come from other consolidations. The objects from other consolidations are provided by the standard library directories on the build system (/lib, /usr/lib). The objects from the OSnet itself are supposed to come from the proto areas in the workspace, and not from the build server. In order to achieve this, we make use of the -L command line option to the link-editor. The link-editor finds dependencies by looking in the directories specified by the caller using the -L command line option. If the desired dependency is not found in one of these locations, ld will then fall back to looking at the default locations (/lib, /usr/lib). In order to use OSnet objects from the workspace instead of the system, while still accessing non-OSnet objects from the system, our Makefiles set -L link-editor options that point at the workspace proto areas. In general, this works well and dependencies are found in the right places. However, there have always been failures:

    - Building objects in the wrong order might mean that an OSnet dependency hasn't been built before an object that needs it. If so, the dependency will not be seen in the proto, and the link-editor will silently fall back to the one on the build server.
    - Errors in the makefiles can wipe out the -L options that our top-level makefiles establish to cause ld to look at the workspace proto first. In this case, all objects will be found on the build server.

    These failures were rarely if ever caught. As I mentioned earlier, the objects on the build server are generally quite close to the objects built in the workspace. If they offer compatible linking interfaces, then the objects that link to them will behave properly, and no issue will ever be seen. However, if they do not offer compatible linking interfaces, the failure modes can be puzzling and hard to pin down. Either way, there won't be a compile-time warning or error.

    The advent of the stub proto eliminated the first type of failure. With stub objects, there is no dependency ordering, and the necessary stub object dependency will always be in place for any OSnet object that needs it. However, makefile errors do still occur, and so the second form of error was still possible.

    While working on the stub object project, we realized that the stub proto was also the key to solving the second form of failure caused by makefile errors:

    - Due to the way we set the -L options to point at our workspace proto areas, any valid object from the OSnet should be found via a path specified by -L, and not from the default locations (/lib, /usr/lib).
    - Any OSnet object found via the default locations means that we've linked to the build server, which is an error we'd like to catch.
    - Non-OSnet objects don't exist in the proto areas, and so are found via the default paths. However, if we were to create a symlink in the stub proto pointing at each non-OSnet dependency that we require, then the non-OSnet objects would also be found via the paths specified by -L, and not from the link-editor defaults.
    - Given the above, we should not find any dependency objects from the link-editor defaults. Any dependency found via the link-editor defaults means that we have a Makefile error, and that we are linking to the build server inappropriately.
    - All we need to make use of this fact is a linker option to produce a warning when it happens. Although warnings are nice, we in the OSnet have a zero-tolerance policy for build noise. The -z fatal-warnings option that was recently introduced with -z guidance can be used to turn the warnings into fatal build errors, forcing the programmer to fix them.

    This was too easy to resist. I integrated

        7021198 ld option to warn when link accesses a library via default path
        PSARC/2011/068 ld -z assert-deflib option

    into snv_161 (February 2011), shortly after the stub proto was introduced into ON. This putback introduced the -z assert-deflib option to the link-editor:

        -z assert-deflib=[libname]

        Enables warning messages for libraries specified with the -l command line
        option that are found by examining the default search paths provided by the
        link-editor. If a libname value is provided, the default library warning
        feature is enabled, and the specified library is added to a list of libraries
        for which no warnings will be issued. Multiple -z assert-deflib options can
        be specified in order to specify multiple libraries for which warnings
        should not be issued.

        The libname value should be the name of the library file, as found by the
        link-editor, without any path components. For example, the following enables
        default library warnings, and excludes the standard C library:

            ld ... -z assert-deflib=libc.so ...

        -z assert-deflib is a specialized option, primarily of interest in build
        environments where multiple objects with the same name exist and tight
        control over the library used is required. It is not intended for general use.

    Note that the definition of -z assert-deflib allows for exceptions to be specified as arguments to the option. In general, the idea of using a symlink from the stub proto is superior because it does not clutter up the link command with a long list of objects. When building the OSnet, we usually use the plain form of -z assert-deflib, and make symlinks for the non-OSnet dependencies. The exception to this is dependencies supplied by the compiler itself, which are usually found at whatever arbitrary location the compiler happens to be installed at. To handle these special cases, the command line version works better.

    Following the integration of the link-editor change, I made use of -z assert-deflib in OSnet builds with

        7021896 Prevent OSnet from accidentally linking to build system

    which integrated into snv_162 (March 2011). Turning on -z assert-deflib exposed between 10 and 20 existing errors in our Makefiles, which were all fixed in the same putback. The errors we found in our Makefiles underscore how difficult they can be to prevent without an automatic system in place to catch them.

    Conclusions

    The stub proto is proving to be a generally useful construct for ON builds that goes beyond serving as a place to hold stub objects. Although invented to hold stub objects, it has already allowed us to simplify a number of previously difficult situations in our makefiles and builds. I expect that we'll find uses for it beyond those described here as we go forward.

    Read the article

  • CodePlex Daily Summary for Thursday, June 10, 2010

    New Projects

    - cab mgt: j mmmjk kjkj
    - CAML Generator: CAMLBuilder makes it easier for generating CAML Query from code. It can be extensively used while Sharepoint Customization thru code. You no longer...
    - Cloud Business Services: ISV Application Accelerator for business management of Cloud Applications built on Windows Azure or any hosting platform. Cloud Business Services ...
    - Community Server 2.1 to WordPress WXL Exporter: This is a simple program to export all CS 2.1 posts and tags by a particular user to WordPress WXL format.
    - DbIdiom for ADO.NET Core: DbIdiom is a set of idioms to use ADO.NET Core (without Dataset) easily.
    - dotsoftRAID: This project is software for using RAID on Windows without special hardware ("Software RAID").
    - DTSRun Job Runner: DTSJobRun makes it easier for SQL Server Developers to control DTS Jobs through a 3rd party execution manager or process control or monitoring control ...
    - Easy Share: Folder sharing is an indispensable part of our professional life. 'EasyShare' is a folder share creation, deletion and editing tool with integrated...
    - elmah2: A project inspired by elmah (http://code.google.com/p/elmah/). The primary goals of this project are a plugin style architecture for logging ...
    - Entropy: Entropy is a component for implementing undo/redo for object models. Entropy implements undo/redo with the memento pattern at the object level so t...
    - eXtremecode Generator: eXtremecode generator is a code generator which makes it easier for asp.net developers to generate a well formed asp.net application by giving it j...
    - GreenBean Script: GreenBean Script is a .NET port of the game-focussed scripting language, GameMonkey Script (http://www.somedude.net/gamemonkey). The first release ...
    - IMAP POP3 SMTP Component for .NET C#, VB.NET and ASP.NET: The Ultimate Mail Component offers a comprehensive interface for sending and receiving e-mail messages from a server and managing your mailbox remotel...
    - PRISM LayoutManager / WindowManager: A layout manager for PRISM
    - SharePoint 2010 Taxonomy Import Utility: Build SharePoint 2010 taxonomies from XML. The SharePoint 2010 Taxonomy Import Utility allows taxonomy authors to define complete taxonomies in XML...
    - SharePoint Geographic Data Visualizer: SharePoint Geographic Data Visualizer includes an Asp.Net server control and Microsoft Sharepoint web part which your users can use to visualize an...
    - Silverlight Reporting: Silverlight Reporting is a simple report writer test bed for Silverlight 4+. The intent is to provide the basics of report writing while being flex...
    - SSIS Expression Editor & Tester: An expression editing and testing tool for SQL Server Integration Services (SSIS). It also offers a reusable editor control for custom tasks or oth...
    - STS Federation Metadata Editor: This is a federation metadata editor for Security Token Services (STS). STSs can be created on any platform (as long as it's based on the oasis sta...
    - WebShopDiploma: WebShopDiploma is a sample application.
    - WEI Share: WEI Share is an application for sharing your Windows Experience Index (WEI) scores from Windows 7 with others in the community. The data can be exp...

    New Releases

    - .NET Transactional File Manager: 1.1.25: Initial CodePlex release. Code from Chinh's blog entry, plus bug fixes including one from "Mark".
    - Active Directory Utils: Repldiag 2.0.3812.27900: Addressed a bug where lingering objects could not be cleaned due to unstable replication topologies resulting from the reanimation of lingering obj...
    - Ajax ASP.Net Forum: developer.insecla.com-forum_v0.1.3: VERSION: 0.1.3. FEATURES: Same as 0.1.2 with some bugs fixed: now the language/cultures DropDownList selectors work, showing the related lang/cult...
    - BigfootMVC: Development Environment Setup DNN 5.4.2: This is the DNN development environment setup, including the database. Does not include the TimeMaster / BigfootMVC code. BigfootMVC is in the source cont...
    - CAML Generator: Version 1.0: Version 1.0 has been released. It is available for download. The version is stable and you can use it. If you have any concerns then please post issue...
    - Community Forums NNTP bridge: Community Forums NNTP Bridge V35: Release of the Community Forums NNTP Bridge to access the social and answers MS forums with a single, open source NNTP bridge. This release has ad...
    - Community Server 2.1 to WordPress WXL Exporter: CStoWXL_v1.0: First (and probably only) release. Exports posts and tags based on the supplied user name. Does not do comments or attachments. To run, do the followin...
    - DbIdiom for ADO.NET Core: DBIdiom for ADO.NET Core 1.0: DBIdiom for ADO.NET Core 1.0
    - DocX: DocX v1.0.0.9: Important: A bug was found in Table.SetDirection() by Ehsan Shahshahani. I have fixed this bug and updated version 1.0.0.9 of DocX. I did not want to...
    - Easy Share: EasyShare ver 1.0: Version 1.0 of the Easy Folder Share tool
    - Enterprise Library Extensions: Release 1.2: This release contains a Windows Communication Foundation service behavior which makes it possible to resolve services using Unity either by applying ...
    - Excel-Dna: Excel-Dna Version 0.26: This version adds initial support for the following: Ribbon support for Excel 2007 and 2010 and hierarchical CommandBars for pre-2007 versions. D...
    - Exchange 2010 RBAC Editor (RBAC GUI) - updated on 6/09/2010: RBAC Editor 0.9.4.2: Only small GUI fixes; the rest of the code is almost the same as version 0.9.4.1. Please use the email address in the About menu of the tool for any feedback and...
    - eXtremecode Generator: eXtremecode Generator 10.6: Download eXtremecode Generator. Open the Connections.config file from the eXDG folder. Define required connections in the Connections.config file. (multi...
    - FAST for Sharepoint MOSS 2010 Query Tool: Version 1.1: Added Search String options to FQL. Added options for Managed Property query.
    - Giving a Presentation: RC 1.2: This new release includes the following bug fixes and improvements: Bug fix: programs not running when the presentation starts are not started when pre...
    - HKGolden Express: HKGoldenExpress (Build 201006100300): New features: User can add emotion icons when posting a new message or replying to a message. Bug fix: (None). Improvements: (None). Other changes: C...
    - IMAP POP3 SMTP Component for .NET C#, VB.NET and ASP.NET: Build 519: Contains source code for the IMAP WinForms Client, POP3 WinForms Client, and SMTP Send Mail Client. A setup package for the lib is also included.
    - Liekhus ADO.NET Entity Data Model XAF Extensions: Version 1.1.1: Compiled the latest bits that took care of some of the open issues and bugs we have found thus far.
    - manx: manx data 1.1: Updated manx data. Includes language and mirror table data.
    - MEDILIG - MEDICAL LIFE GUARD: MEDILIG 20100325: Download the latest release from Sourceforge at http://sourceforge.net/projects/medilig
    - MiniCalendar Web Part: MiniCalendar WebPart v1.8.1: A small web part to display links to events stored in a list (or document library) in a mini calendar (in month view mode). It shows tooltips for t...
    - Mytrip.Mvc: Mytrip.Mvc 1.0.43.0 beta: Mytrip.Mvc 1.0.43.0 beta web. Web for install hosting. System Requirements: NET 4.0, MSSQL 2008 or MySql (auto create table to database). Mytrip.Mv...
    - NodeXL: Network Overview, Discovery and Exploration for Excel: NodeXL Excel 2007 Template, version 1.0.1.126: The NodeXL Excel 2007 template displays a network graph using edge and vertex lists stored in an Excel 2007 workbook. What's New: This version fixes...
    - Oxygen Smil Player: OxygenSmilPlayer 1.0.0.1a: Second Alpha Upload
    - PowerAuras: PowerAuras V3.0.0K Beta1: New Auras: Item Name, Equipment Slot Tracking
    - PowerExt: v1.0 (Alpha 2): v1.0 (Alpha 2). PowerExt can display information such as assembly name, assembly version, public key etc. in Explorer's File Properties dialog.
    - Powershell Scripts for Admins: PowerBizTalk 1.0: The BizTalk PowerShell Module allows you to control: Action, Component, Start, Stop, Get, Enlist, Unenlist, Remove, Applications x...
    - Refix - .NET dependency management: Refix v0.1.0.75 ALPHA: Latest updated version now supports a remote repository, an implementation of which is supplied as an ASP.NET MVC website.
    - Resonance: TrainNode Service Beta: Train Node Service binary setup package
    - SCSM Incident SLA Management: SCSM Incident SLA Management Version 0.2: This is the first release of the SCSM SLA Management solution. It is a 'beta' release and has only been tested by the developers on the project. T...
    - secs4net: Release 1.01: Removed the System.Threading.dll (Rx included) dependence. SML related functions were moved out.
    - SharePoint 2010 Managed Metadata WebPart: Taxonomy WebPart 0.0.2: Applied fix to support Managed Metadata fields that allow multiple values.
    - SharePoint 2010 Taxonomy Import Utility: TaxonomyBuilder Version 1: Initial Release. This release includes full XML to Term Store import capabilities. See roadmap for more information. Please read the release license prio...
    - SharePoint Geographic Data Visualizer: Source Code: Source Code
    - SOAPI - StackOverflow API Parser/Wrapper Generator: SOAPI Beta 1: Beta 1 release. API parser/generator, JavaScript, C#/Silverlight wrapper libraries. Up-to-the-hour current generated files can be found @ http://s...
    - SSIS Expression Editor & Tester: Expression Editor and Tester: Initial release of the expression editor tool and editor control. Download and extract the files to get started; no install required.
    - STS Federation Metadata Editor: Version 0.1 - Initial release: This is the initial release of the editor. It contains all the basic functionality but doesn't support multiple contact persons and multiple langu...
    - SuperSocket: SuperSocket(0.0.1.53867): This release fixed some bugs and added a new easy sample. The source code of this release includes: source code of SuperSocket, a remote process c...
    - thinktecture Starter STS (Community Edition): StarterRP v1.1: Cleaned up version with identity delegation sample (in sync with StarterSTS v1.1)
    - thinktecture Starter STS (Community Edition): StarterSTS v1.1: New stable version. Includes identity delegation support.
    - VCC: Latest build, v2.1.30609.0: Automatic drop of latest build
    - Watermarker.NET: 0.1.3812: Stability fix
    - Wouter's SharePoint Demo Land: Navigation Service with WCF Proxy: A SharePoint 2010 Service Application that uses WCF service proxies to relay commands to the actual service.
    - Xna.Extend: Xna.Extend V1.0: This is the first stable release of the Xna.Extend Library. Source code and Dynamic Link Libraries (DLLs) are available with documentation. This ve...
    - Yet Another GPS: YaGPS Beta 1: Beta 1 Release. Fix Installer Default Folder. Fix Sound Language Folder Problem. Fix SIP Keyboard Focus Error. Add Arabic Sound Language. Add Fr...

    Most Popular Projects

    - WBFS Manager
    - Rawr
    - AJAX Control Toolkit
    - Microsoft SQL Server Product Samples: Database
    - Silverlight Toolkit
    - Windows Presentation Foundation (WPF)
    - patterns & practices – Enterprise Library
    - PHPExcel
    - Microsoft SQL Server Community & Samples
    - ASP.NET

    Most Active Projects

    - Community Forums NNTP bridge
    - patterns & practices – Enterprise Library
    - jQuery Library for SharePoint Web Services
    - Rhyduino - Arduino and Managed Code
    - BlogEngine.NET
    - NB_Store - Free DotNetNuke Ecommerce Catalog Module
    - MediaCoder.NET
    - Andrew's XNA Helpers
    - smark C# Library
    - Rawr

    Read the article

  • Extra blank page when converting HTML to PDF using abcPDF.

    - by ProfK
    I have an HTML report, with each print page contained by a <div class="page">. The page class is defined as:

        width: 180mm;
        height: 250mm;
        page-break-after: always;
        background-position: centre top;
        background-image: url(Images/MainBanner.png);
        background-repeat: no-repeat;
        padding-top: 30mm;

    After making a few changes to my report content, when I call abcPDF to convert the report to PDF, suddenly I'm getting a blank page inserted after every real report page. I don't want to roll back the changes I've just made to remove this problem, so I'm hoping someone may know why the extra pages are being inserted.
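    One likely culprit, assuming the converter honours the CSS box model: the 30mm padding-top is added on top of the 250mm height, so each div is 280mm tall and can overflow the printable area once margins are applied, spilling onto an extra sheet before the forced break. A sketch of the usual fix (note also that 'centre' is not a valid CSS keyword; the property takes 'center'):

        .page {
            width: 180mm;
            height: 220mm;                    /* leave room for the 30mm padding-top */
            padding-top: 30mm;
            page-break-after: always;
            background-position: center top;
            background-image: url(Images/MainBanner.png);
            background-repeat: no-repeat;
        }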

    Read the article

  • In Eclipse RCP, how do I disable a save toolbar button according to the "dirty" property of the editor?

    - by paulgreg
    In my Eclipse RCP 3.3 application, I would like to enable or disable a 'save' toolbar button according to the current editor's dirty flag. I'm trying to use the <enabledWhen> tag but I can't make it work. Here's the portion of code in plugin.xml:

        <command
              commandId="org.acme.command.save"
              icon="icons/save.png"
              id="org.acme.command.save"
              style="push">
           <enabledWhen>
              <instanceof value="activeEditor"/>
              <test property="dirty" value="true"/>
           </enabledWhen>
        </command>

    Do you have any idea how that is supposed to work?

    Read the article

  • Can FPDF/FPDI use a PDF in landscape format as a template?

    - by Jim OHalloran
    I am trying to import an existing PDF as a template with FPDI. The template is in landscape format. If I import the template into a new document the template page is inserted in portrait form with the content rotated 90 degrees. If my new document is in portrait the full content appears, but if the new document is also landscape, the content is cropped. Is it possible to use a landscape template with FPDI? Thanks in advance! Jim.
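    FPDI can do this, but both the document and the page the template is placed on need the landscape flag, since an imported page is just a reusable XObject with no orientation of its own. A sketch, assuming the FPDF and FPDI class files and a template.pdf in the script's directory:

        <?php
        require_once('fpdf.php');
        require_once('fpdi.php');

        // 'L' forces landscape for both the document and the added page
        $pdf = new FPDI('L', 'mm', 'A4');
        $pdf->AddPage('L');
        $pdf->setSourceFile('template.pdf');
        $tplIdx = $pdf->importPage(1);
        // width 297mm = A4 landscape, so the template fills the page uncropped
        $pdf->useTemplate($tplIdx, 0, 0, 297);
        $pdf->Output('out.pdf', 'F');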

    Read the article

  • VS2008 CR report viewer: Print and Export to PDF are not working

    - by S M Kamran
    I have a web project in VS2008. The problem is that, from the web report viewer, the report is not getting printed or exported to PDF. The report is shown alright, but when the print button or export button is pressed, nothing happens. No errors or crashes; just nothing. The default printer is set up alright, and I am able to print from that machine. Am I missing something here?

    Earlier, when I installed the application in a new virtual directory, the report was not shown at all; I then copied the aspnet_client folder into my newly created web application root, and the report was made visible. However, the print and export functionality is still not working.

    Read the article

  • What is the best XPS WYSIWYG Editor that will output XAML?

    - by Simon
    I am looking for a WYSIWYG editor for XPS that will output the raw XAML. My intention is to use the tool to quickly generate and test the XAML (of XPS documents), and then generate the same XAML in code to programmatically produce an XPS document; the same approach that many people use for generating HTML. Can anyone recommend any of the products below, or suggest any others? Better yet, does anyone know of a free XPS editor?

    - NiXPS Edit ($300 USD): http://www.nixps.com/nixps_edit_20.html
    - XamlPad: free, but also quite limited, and does not really target XPS.
    - Office System Power Tools: not WYSIWYG. http://blogs.msdn.com/adrianford/archive/2008/07/03/visual-studio-and-xps-files.aspx
    - XPS eXaminer 1.0 ($399 USD): http://storefront.qualitylogic.com/p-16-qualitylogic-xps-examiner.aspx

    Read the article
