# Do You Wish Lightroom Was Quicker? Adobe Does Too



## Canon Rumors Guy (Jul 11, 2017)

According to Adobe, they are putting a priority on speeding up the performance of Lightroom. Sluggish performance has been a consistent concern for some Lightroom users for quite some time now.

**Adobe on Lightroom Performance:**

> I would like to address concerns recently voiced by our community of customers around Lightroom performance, as improving performance is our current top priority. We have a history, starting with our first public beta, of working with our customers to address workflow and feature needs, and we’d like to take that same approach regarding your performance concerns. We already understand many of the current pain points around GPU, import performance, certain editing tasks and review workflows and are investing heavily in improving those areas. Over the past year we’ve added numerous enhancements to address your performance concerns but we understand we will have a lot of work to do to meet your expectations. If you have feedback or would like to work with the Lightroom team on your most pressing issues, please [fill out this survey](https://www.surveymonkey.com/r/LrDesktop_performance).

Adobe has apparently heard its customers, and everyone should continue sending in concerns about Lightroom’s performance issues.


----------



## jebrady03 (Jul 11, 2017)

I left a comment on their post saying I was BLOWN AWAY that they consider the outpouring of vitriol to be "recent". I mean... c'mon!!!!!!!!


----------



## CanonCams (Jul 11, 2017)

Their HDR merge is a god-awful long process.


----------



## nonac (Jul 11, 2017)

I can't stand how long it takes to edit photos. They need an interface that lets you quickly cull images like Photo Mechanic does. I shoot thousands of sports images and I'm so glad I found Photo Mechanic for the initial culling; it is blazing fast. With that said, I would much rather have this ability in one program so I don't have to transfer what's left over to Lightroom for editing. Once there, of course, the editing really slows things down. I hope Adobe fixes it soon or I'll be looking elsewhere. I sometimes have to edit and send things to my media outlet at halftime during a game, which only leaves a few minutes to cull and edit.


----------



## YuengLinger (Jul 11, 2017)

Not sure this is the place to post. Adobe sites!

I gave up years ago using LR for initial review and culling. DPP is still warp factors faster for that.

For edits, I don't have much to compare to except PS and plug-ins. PS is super quick. Plug-ins are a bit sluggish compared to LR.

We'll see if this is just empty hype or sincere. Within a decade.


----------



## melbournite (Jul 11, 2017)

This is why I’m still holding on to Aperture with one hand on Capture One. Aperture (which is seven years old) still runs faster than any software I have tried, and it's more intuitive to boot. I wonder why they can’t use the same architecture?

I know I can’t hold on to Aperture forever (the Mk4 raws don’t work with it), but I will hold on till the end, and/or till some miracle happens, i.e. Lightroom gets an overhaul, or Photos gets a Pro plugin.


----------



## jthomson (Jul 11, 2017)

The very existence of Photo Mechanic should give them a great big clue ;D
They need a browse/cull mode where you can look at the embedded JPEGs in the raw files to select discards quickly.
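For what it's worth, the embedded-preview idea is simple enough to sketch. This is a toy illustration, not how real culling tools work (they parse the raw file's actual TIFF/CR2 structure); it just scans for the JPEG start/end markers that bracket most embedded previews:

```python
# Sketch: pull an embedded JPEG preview out of a raw file's bytes by
# scanning for the JPEG SOI (FF D8) and EOI (FF D9) markers. A real
# implementation would walk the raw format's IFD structure instead.

def extract_embedded_jpeg(data):
    """Return the first JPEG stream found in `data`, or None."""
    start = data.find(b"\xff\xd8\xff")   # SOI marker plus next marker byte
    if start == -1:
        return None
    end = data.find(b"\xff\xd9", start)  # EOI marker
    if end == -1:
        return None
    return data[start:end + 2]

# Usage with a synthetic "raw" blob (header bytes around a tiny JPEG stream):
fake_raw = b"RAWHDR\x00\x01" + b"\xff\xd8\xff\xe0JFIFdata\xff\xd9" + b"\x00trailer"
preview = extract_embedded_jpeg(fake_raw)
print(preview is not None and preview.startswith(b"\xff\xd8"))  # True
```

Displaying these previews instead of demosaicing the raw is exactly why dedicated culling tools feel so fast.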


----------



## rwvaughn (Jul 11, 2017)

The catalog system is bloated and a resource hog.


----------



## AvTvM (Jul 11, 2017)

rwvaughn said:


> The catalog system is bloated and a resource hog.



exactly! that database stuff needs to go. OS could take care of metadata and keywords directly.


----------



## Mikehit (Jul 11, 2017)

AvTvM said:


> rwvaughn said:
> 
> 
> > The catalog system is bloated and a resource hog.
> ...



Metadata and keywords are instrumental to the database. 
And if the metadata and keywords are in the OS, how would you transfer things between machines?


----------



## docfrance (Jul 12, 2017)

Then there's AfterShot Pro that is blazingly fast and effective.


----------



## Silverstream (Jul 12, 2017)

My biggest wish for Lightroom, as they expand Lightroom mobile (web), is to make it a better interface for clients to select which images they want following a shoot.
We need to be able to add a watermark.
We need clients to be able to sort and filter their selections to make it easier for them to whittle down suggestions.
We need the comments made through the web interface to actually be stored in the local catalog.
We need to be able to receive clear notifications, and to sort to see what selections the client has made along with their notes.


----------



## Mt Spokane Photography (Jul 12, 2017)

I have no issues at all with speed, but some users seem to have them, often with home-built computers using high-end components. I learned the hard way that building my own PC is not only expensive (because I used high-end components) but, more importantly, that those components did not always give good performance with some software, whereas plain components worked well. ACDSee is a different story: it struggles to run anywhere near as fast as Lightroom or Photoshop on my common Dell XPS PC.

Software can always be improved.


----------



## Otara (Jul 12, 2017)

Totally agree that a better culling tool is needed.


----------



## Chaitanya (Jul 12, 2017)

Along with performance improvements (stability more than speed), we would also like to see a new version of standalone Lightroom.


----------



## AvTvM (Jul 12, 2017)

Mikehit said:


> AvTvM said:
> 
> 
> > rwvaughn said:
> ...



For me, metadata belongs right in the header of the respective image file, not in a big fat database. Writing, reading and searching metadata in file headers is something every reasonable OS can do natively, quite well and very fast as a matter of fact. I never understood why Adobe felt the need to duplicate file organization with its weirdo database/catalogue, rather than letting the OS do that job.

I would really love a new, SLIM version of LR ... like DPP in terms of raw conversion, plus all the local adjustment and editing possibilities of LR, plus an included editor for EXIF/IPTC/keyword data. But without that bloated big fat database. No "importing", no catalogue, no "exporting". Simply "open", "edit" and "save / save as [e.g. jpg]" - as with any other program/app I care to use.

If Canon DPP had perspective/keystone correction (as in LR) and *local adjustments*, not only global ones - I would say goodbye to Adobe today.


----------



## Zv (Jul 12, 2017)

Saw this on Reddit before CR this morning and filled out the survey. I used to love LR but now I despise using it which is putting a downer on my whole creative experience. I just don't have the motivation to go through even 100 pics one by one when LR Library module is so slow. It takes forever and I can't be bothered using another program to do that. I want to do it all in one place - that's what made LR great to begin with. Now it's just bloated with crap. They should concentrate on Library, Develop and Print modules only and throw out the rest. Seriously considering ditching it but I have over 5 years of catalogs which makes me think twice. It's about time they addressed this issue.


----------



## ethanz (Jul 12, 2017)

Serious question: why not just use Bridge and Photoshop? It's fast, and I would assume it has almost all the features of Lightroom.


----------



## privatebydesign (Jul 12, 2017)

ethanz said:


> Serious question, why not just use Bridge and Photoshop? Its fast and I would assume has almost all the features of Lightroom.



Bridge and ACR combined are more powerful, have more features and are faster than LR.


----------



## tpatana (Jul 12, 2017)

privatebydesign said:


> ethanz said:
> 
> 
> > Serious question, why not just use Bridge and Photoshop? Its fast and I would assume has almost all the features of Lightroom.
> ...



In a typical shoot I'm doing ~100 crops in a row; how do you do that in Bridge/ACR? Aside from the slow-down, LR is perfect for my use. Maybe I complained often enough about it that they invited me to the public beta a couple of weeks ago.


----------



## mahdi_mak2000 (Jul 12, 2017)

I hope they make a better raw converter for new Canon bodies.
I have to use the rubbish DPP for conversion.


----------



## Orangutan (Jul 12, 2017)

AvTvM said:


> rwvaughn said:
> 
> 
> > The catalog system is bloated and a resource hog.
> ...


The filesystem of the OS is a database. If the LR database is slow then it's a defect in their particular implementation of a database, not in the concept of a database. Personally, I think database for metadata is the correct way to do it. I detest the fact that DPP wants to embed changes in the original file: every time a file is opened for writing there is a chance of corruption. Raw files should be marked as read-only by the OS, and all edits/metadata stored in a database.
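The read-only part of that suggestion is something any tool (or user) can do today. A minimal sketch, assuming a POSIX-style permission model; the dict here is just a stand-in for an edits database:

```python
# Sketch of "originals read-only, edits elsewhere": strip the write-permission
# bits from a raw file so in-place modification is blocked at the OS level,
# while adjustments are recorded in a separate store.
import os
import stat
import tempfile

def protect_original(path):
    """Clear all write-permission bits on `path`."""
    mode = os.stat(path).st_mode
    os.chmod(path, mode & ~(stat.S_IWUSR | stat.S_IWGRP | stat.S_IWOTH))

edits = {}  # filename -> list of edit operations, kept outside the file

with tempfile.NamedTemporaryFile(suffix=".cr2", delete=False) as f:
    f.write(b"fake raw sensor data")
    raw_path = f.name

protect_original(raw_path)
edits[raw_path] = [("exposure", +0.5)]  # the edit never touches the raw file

writable = bool(os.stat(raw_path).st_mode & stat.S_IWUSR)
print(writable)  # False: the OS now flags the original as read-only

os.chmod(raw_path, stat.S_IRUSR | stat.S_IWUSR)  # restore so we can clean up
os.unlink(raw_path)
```

This is essentially what non-destructive editors do conceptually: the raw stays immutable and only the recipe of adjustments is stored.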


----------



## Talys (Jul 12, 2017)

Orangutan said:


> AvTvM said:
> 
> 
> > rwvaughn said:
> ...



I couldn't agree more. There is nothing flawed with the concept of a database at all. You're trading space (which is almost never a constraint) for speed, especially for indexed search. If a database is slow, it's a crappy database system, or more likely, a fine database system with a crappy implementation.


----------



## rwvaughn (Jul 12, 2017)

AvTvM said:


> rwvaughn said:
> 
> 
> > The catalog system is bloated and a resource hog.
> ...



If PhotoMechanic had white balance and basic editing ability (tone/contrast/sharpening) I'd have no need for Lightroom at all.


----------



## Talys (Jul 12, 2017)

tpatana said:


> privatebydesign said:
> 
> 
> > ethanz said:
> ...




My preference is ACR as well. However, I never need to do a hundred crops in a row. If I did have to do hundreds in a batch, damn straight I'd use Lightroom. Not to mention, if you are doing X-Rite color correction on a hundred photos, Lightroom would be much easier.


----------



## niels123 (Jul 12, 2017)

Mt Spokane Photography said:


> I have no issues at all as to speed, but some users seem to have issues, often with home built computers using high end components. I learned the hard way that building my own pc is not only expensive, because I used high end components, but more importantly, the components did not always give good performance with some software where as plain components worked well. ACDSEE is a different story, it struggles to run nearly as fast as lightroom or photoshop using my common Dell XPS PC.
> 
> Software can always be improved.



Which components in a home-built computer would be conflicting? Graphics cards are usually compatible with the motherboard, Windows, etc. If there are issues there, then it's just a card which is not properly supported. Motherboard/CPU/RAM are such "basic" components that I can't believe they cause compatibility issues.


----------



## Mikehit (Jul 12, 2017)

AvTvM said:


> For me, metadata belongs right into the header of the respective image file, not into a big fat database. Writing/Reading/Searching metadata in file headers is something every reasonable OS can do .. natively. Quite well and very fast as a matter of fact. I never understood, why Adobe felt the need to duplicate file organization with its weirdo database/catalogue, rather than letting the OS do that job.
> 
> I would really love a new, SLIM version of LR ... like DPP in terms of raw converter plus all the local adjustment and editing possibilities of LR. Plus included editor for EXIF/ITPC/keyword data. But without that bloated big fat database. No "importing", no catalogue, no exporting". Simply "open", "edit" and "save / save as [e.g. jpg] ..." - as with any other program/App I care to use.
> 
> If Canon DPP had perspective/keystone correction (as in LR) and *local adjustments*, not only global ones - I would say goodbye to Adobe today.



LR was originally designed as a digital rights management program, to which the catalog was instrumental. Everything you say amounts to 'I don't need Lightroom'. So don't use it.

Many professionals use Photomechanic or Breezebrowser as their culling tool and LR as their primary editing tool. It sounds like your approach to programs is like your approach to cameras - any one that cannot do everything you want it to do is crap. Live with it.


----------



## AvTvM (Jul 12, 2017)

i am fully aware of lr history. i do not need or use it as an image inventory management system. windows and my carefully chosen image naming scheme take perfect care of that. i started with raw shooter as raw converter, and when it was bought by adobe only to take it from the market, i went with the free license of lr 1.0 ...

i actually do like lightroom ... but solely *as a raw processor and photo editing software*. it is still the only app that does all i want and need and has an acceptable user interface (as opposed to ps).

if canon dpp were just a bit enhanced in its "editing capabilities" - especially perspective correction and local adjustments/edits - not even to dream of things like "content aware fill" - i would prefer and use it. but unfortunately canon does not seem interested or willing/able to offer this. btw: yes, i would be willing to pay for it - eternal license, 100 € every 3 years, no problem.

acr + photoshop is no solution for me, since ps is even more bloated than lr and the ps user interface gives me the creeps. i bought cs5 only to wipe it from my pc after about an hour. totally un-usable. no, i will never buy books or watch nerd videos on youtube in order to "learn" how to operate software. apps that want a chance with me need to come with a self-explanatory user interface = decent menus, no "alt-shift-ctrl" keyboard "shortcuts", pure and straightforward workflow oriented, mouse + click. the closer it is to the ms office UI, the better.

all other software i have tried so far that "in principle offers what i need" - excellent raw conversion plus some light editing without explicit levels - suffers either from being "too dumbed down" (looking at PS Elements) and/or being "too much/too expensive/too user-unfriendly". unfortunately.

currently i am on lr 5.7 and don't have major performance issues, despite a catalogue with 200.000 images in it. but it is very bloated, and all the files that lr creates - catalogues, backups, previews, xmps and a whole slew of other stuff, some of it in obscurely hidden folders - have grown in total to almost the same size as the image data (raws and jpgs) themselves. which i consider ludicrous.

as soon as i buy a new camera, lr 5.7 will no longer support it. i will not rent adobe cc. i will not buy lr 6 since it is old already and does not contain full functionality. there will probably be no lr 7 as permanent license. if so, adobe will soon lose me as a customer. even when it means i will lose my "edit recipes" and the capability to go back and change something ... for 200.000 images. unfortunately there is still no satisfactory app in sight as a replacement. they all are either "raw converter" only (but for canon raws nothing beats canon dpp) or they are too "ps-like / pixel manipulation" centric (affinity etc) ... which is not what i want/need either.

major dilemma. and i really hate adobe for it. why no "lr lite" without database/catalogue and without "cc rental / force-bundled with ps, which is useless for me"?


----------



## LDS (Jul 12, 2017)

AvTvM said:


> For me, metadata belongs right into the header of the respective image file, not into a big fat database. Writing/Reading/Searching metadata in file headers is something every reasonable OS can do .. natively. Quite well and very fast as a matter of fact.



No, actually it can't, unless it "indexes" the files, which means it uses a database as well to store files metadata so it can search for them quickly. The difference is that the LR database is optimized for image files (unlike an OS one, which has to cope with many more types), and it is exportable so you can move it to a different/newer machine. It's also fairly easy to add the required data to a database, but it could be impossible to modify an OS to handle data it doesn't support natively - using sidecar files would just slow things down.

BTW, there are increasing concerns about how current file systems can cope with ever-increasing disk sizes, and one of the solutions being attempted is to make them more database-like.
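As a rough illustration of what an image-optimized metadata database buys you (table, column and index names are invented for this sketch, not LR's actual catalog schema):

```python
# Toy version of a catalog database: image metadata in one indexed SQLite
# table, so "all 5-star shots with keyword X" is a single query instead of
# opening and inspecting every file on disk.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE photos (
    path TEXT PRIMARY KEY, rating INTEGER, keyword TEXT)""")
con.execute("CREATE INDEX idx_rating_kw ON photos (rating, keyword)")
con.executemany(
    "INSERT INTO photos VALUES (?, ?, ?)",
    [("2017/07/img_001.cr2", 5, "sports"),
     ("2017/07/img_002.cr2", 2, "sports"),
     ("2017/08/img_103.cr2", 5, "landscape")])

# The index answers this without touching the image files at all:
hits = con.execute(
    "SELECT path FROM photos WHERE rating = 5 AND keyword = 'sports'"
).fetchall()
print(hits)  # [('2017/07/img_001.cr2',)]
```

The same file can appear under any number of query-defined "organizations" without being moved, which is the point made above about date-based folders plus database views.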



AvTvM said:


> I never understood, why Adobe felt the need to duplicate file organization with its weirdo database/catalogue, rather than letting the OS do that job.



LR doesn't duplicate the file organization - and that's a big plus. My file organization is by date, because it simplifies backups and archiving - but a database lets me access files under several different "organizations" as needed.


----------



## AvTvM (Jul 12, 2017)

LDS said:


> No, actually it can't, unless it "indexes" the files, which means it uses a database as well to store files metadata so it can search for them quickly. The difference is that the LR database is optimized for image files (unlike an OS one, which has to cope with many more types), and it is exportable so you can move it to a different/newer machine. It's also fairly easy to add the required data to a database, but it could be impossible to modify an OS to handle data it doesn't support natively - using sidecar files would just slow things down.



> LR doesn't duplicate the file organization - and that's a big plus. My file organization is by date, because it simplifies backups and archiving - but a database lets me access files under several different "organizations" as needed.

not needed in 2017. Windows takes care of it all. I use only 2 apps with a database that duplicates what Windows could do on its own: MS Outlook and Adobe LR. Both suck ... because of their big, fat, clumsy database.

Image tagging is not needed any longer. There are enough apps that do automatic tagging by image content and face recognition. The rest: metadata into the file header, everything handled by the OS (and its index database).


----------



## LDS (Jul 12, 2017)

privatebydesign said:


> Bridge and ACR combined are more powerful, have more features and are faster than LR.



Actually, ACR and LR, AFAIK, share the same RAW and image editing engine, just with slightly different UIs built on top of it. Therefore, performance is the same.

Just to get ACR you need Photoshop - and you'll also need the latter (or another application) for output sharpening and printing, because you can't do those from ACR.

I find print proofing and printing far easier in LR than in Photoshop; the latter may yield better results, but the effort required is higher.


----------



## AvTvM (Jul 12, 2017)

LDS said:


> privatebydesign said:
> 
> 
> > Bridge and ACR combined are more powerful, have more features and are faster than LR.
> ...




the UI is not "slightly different", but "night and day" different.
PS = absolutely unusable for people like me [unwilling to "learn" software].
LR = quite fine for people like me [people wanting an intuitive UI].

I don't have many issues with the Adobe RAW engine per se. The problem is elsewhere: I want a *limited, but still powerful* set of "simple photo editing" options, to be applied directly on RAW.

meaning: 
* good perspective correction [like LR] 
* plus simple, "no-levels-needed" local edits/adjustments [like LR] 
* things like "intelligent content aware fill" etc. also welcome
* big bloated database = not needed, not wanted 

Current options:
* Bridge+PS = way too complex for what i need. 
* PS Elements = too dumb, does not offer what I need 
* LR has editing functionality, but only in rental/CC version - not in LR 6 ... ... and only with that bloated database 
* DPP = only Canon Raws, good raw converter, only global adjustments, no reasonable perspective correction
* other RAW converters: suffer from exactly the same
* other photo editing software: either dumb as Elements and/or "instagram/art filter orgy" or overly complex

THAT's the problem

PS: of course that's only me. I know that many others' mileage will vary. But I also know there are many "photo enthusiasts" looking for exactly the same thing I am after: something like *Lightroom lite* ... sans database.


----------



## LDS (Jul 12, 2017)

AvTvM said:


> not needed in 2017. Windows takes care of it all. I use only 2 apps with a database that duplicates, what windows could do on its own: MS Outlook and Adobe LR. Both suck .. because of their big, fat, clumsy database.



Frankly, I find the Windows indexer quite heavy; Samsung, for example, advises disabling it on SSDs because the heavy rewrite cycles at every file modification just wear the disk more. It also doesn't handle a lot of EXIF data, only a subset of it, and the search UI is quite ugly.

Nor can you transport catalogs, which is something I do quite often when I work on my Surface while traveling and then move everything into the main catalog upon my return - without losing any data.

Outlook uses the Windows file indexer to search mailbox contents locally - it doesn't use its own database (search doesn't work if the indexer is disabled). So you just asserted that the Windows search function sucks.



AvTvM said:


> Image tagging is not needed any longer. Enough apps that do automatic tagging by image content and face recognition. Rest: metadata into file header, everything handled by OS (and its index database).



A "file header" could not contain all the metadata you may want to add when the file format is outside your control. Unless you encapsulate the original one inside another, and then you get another proprietary format to manage (like DNG). Or you have to use sidecar files.

File systems that store data in some kind of "extended attributes" may easily lose them when files are copied to file systems which don't understand them. It is a quite common issue when you copy data from NTFS or HFS/APFS to FAT disks or backup systems. With LR you can move a catalog from a Windows system to a Mac through a FAT disk, or restore from a backup, and you don't lose anything.

The OS database would be a big mess because it won't contain only the image files, but all the indexed files on disk - with broadly different requirements. OS developers won't care about specific needs; they will fill the most common ones, and they won't index all those IPTC fields.

I tag images the way I like, not the way some apps believe I have to tag them - and they of course won't know the specific personal tags I wish to add. Face recognition is quite useless when you have a lot of unknown people, each probably photographed only once.


----------



## AvTvM (Jul 12, 2017)

LDS said:


> The OS database will be a big mess because it won't contain only the image files, but all the indexed files on disk - with broadly different requirements. OS developers won't care about specific needs, they will fill the most common needs, they won't index all those IPTC fields.



Well, my experience is different. Windows search works like a charm now. Automatic tagging is coming right along now.

I want my image files handled exactly like my Excel worksheets, my Word documents, my PowerPoint presentations. One file contains all necessary information. Good naming scheme, orderly folder structure.

I hate MS Outlook. I would like to have one folder each with files in it - 1x contacts, 1x calendar events/tasks. Every appointment a separate file. Every contact a separate file. The app just needs to link them. The .pst database is a big PITA. Exactly the same as the LR database. Not needed for what I do.

That's why I want an LR lite. Regular LR can continue, no problem. But for me: "lite, without database", please.

Pixmantec RawShooter was pretty perfect for me [at its time].


----------



## Mikehit (Jul 12, 2017)

AvTvM said:


> One file, contains all necessary information. Good naming scheme, orderly folder structure.



The whole point of LR is you don't need a file structure. It greatly simplifies searching and cross-referencing.



AvTvM said:


> That's why I want a LR lite. Regular LR can continue, no problem. But for me: "Lite, without database", please.



And if they did that it wouldn't be LR, would it!! It would be a different system with a different name... something like... Photomechanic, maybe. That would be a good name for it.


----------



## zim (Jul 12, 2017)

Mikehit said:


> And if they did that it wouldn't be LR would it!! it would be a different system with a different name...something like...Photomechanic maybe. That would be a good name for it.



His problems are deeper than that, no-one will make the camera of his dreams, no-one makes the software of his dreams... what a nightmare, I'd take up a new hobby!


----------



## Orangutan (Jul 12, 2017)

AvTvM said:


> I hate MS Outlook. I would like to have 1 folder each with files in it - 1x contacs, 1x calender events/tasks. Every appointment a separate file. Every contact a separate file.


Welcome to 1997! Outlook, to me, is unusable without good indexing/search features -- I have over 100k items in my Outlook and still need access to the oldest.



> That's why I want a LR lite. Regular LR can continue, no problem. But for me: "Lite, without database", please.


Again, you're just asking for a more effective database, rather than no database at all.

The problem is that Adobe's lead "engineers" are clueless about what people actually want in product behavior. They've stumbled along the path to a product with a lot of power, but also a lot of thorns, and they're hoping no one catches up. They need to take some of that new subscription cash that's burning holes in their pockets and put it into fixing these types of problems.


----------



## AvTvM (Jul 12, 2017)

Orangutan said:


> AvTvM said:
> 
> 
> > I hate MS Outlook. I would like to have 1 folder each with files in it - 1x contacs, 1x calender events/tasks. Every appointment a separate file. Every contact a separate file.
> ...



i have 100.000+ text files/documents from 25 years and 200.000+ images going back to 2000. i have no problem finding any of them. windows can even search inside the content of text documents. it is all indexed already by the os. the windows (or other os) "file management database" is all that's needed. that's why every file format has a header with space for metadata. applications should not "double up" with an extra layer of database on top of it.


----------



## privatebydesign (Jul 12, 2017)

LDS said:


> privatebydesign said:
> 
> 
> > Bridge and ACR combined are more powerful, have more features and are faster than LR.
> ...



Performance is not the same as speed. Bridge+ACR can open pretty much any image file, so for an imaging professional it is a much more efficient tool, and efficiency ends up equating to time.

How many .PSB files do you have? LR can't see them; it is primarily a 'simple' photo program that can't open a lot of 'photo' files.


----------



## LDS (Jul 12, 2017)

AvTvM said:


> thats why any file format has a header with space for metadata.



Actually, it's far more complex than you think. Some file formats have no headers at all; others store metadata in far more complex structures than simple headers. It may be impossible to add new metadata without breaking the format - especially with formats like RAW files, which are designed for camera write speed, not versatility, and may not be fully documented.

It may also create issues like the one LR creates when storing changes inside DNGs - incremental backups then need to back up the whole modified DNG (possibly several MBs) too.



AvTvM said:


> applications should not "double up" with an extra layer of database on top of it.



LR features like collections, virtual copies and proof copies would be much harder to implement using only files. You may think links and a file containing a list of files would do - but think about what happens when you delete or move an image: you would need to check which links become orphaned (very few file systems do that automatically) and look in every collection to delete or move any reference. A (relational) database does that automatically (it's called "referential integrity").
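The referential-integrity point can be shown with SQLite itself, the engine LR's catalog is built on. The schema here is invented for illustration, not LR's actual catalog schema:

```python
# Referential integrity in action: deleting a photo row automatically removes
# its collection memberships via ON DELETE CASCADE - no manual orphan cleanup.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs per-connection
con.executescript("""
    CREATE TABLE photos (id INTEGER PRIMARY KEY, path TEXT);
    CREATE TABLE collections (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE collection_items (
        photo_id INTEGER REFERENCES photos(id) ON DELETE CASCADE,
        collection_id INTEGER REFERENCES collections(id) ON DELETE CASCADE);
""")
con.execute("INSERT INTO photos VALUES (1, 'img_001.cr2')")
con.execute("INSERT INTO collections VALUES (1, 'Best of 2017')")
con.execute("INSERT INTO collection_items VALUES (1, 1)")

con.execute("DELETE FROM photos WHERE id = 1")  # drop the image...
left = con.execute("SELECT COUNT(*) FROM collection_items").fetchone()[0]
print(left)  # 0: the membership rows vanished with the photo
```

Doing the same with plain files and lists of paths would mean scanning every collection file on every delete or move.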

You may be surprised that a lot of applications, even on smartphones, put a database layer over the data they manage. Where do you believe, for example, WhatsApp stores its chats? Thunderbird stores mail in files, but just like LR it uses a SQLite database to index them.

Moreover, applications that need to run on different platforms prefer to avoid coping with the nuisances of the underlying OS when they can. There are differences in how the Apple and Windows file systems work, and in how an application can interact with them. Using a common database on every platform ensures the application is not subject to the whims of the platform - and its functionality is the same everywhere.


----------



## Diko (Jul 12, 2017)

LDS said:


> It may also create issues the one LR creates when storing changes inside DNGs - which means incremental backups need to backup the new DNG (possibly several MBs) too.



LR does NOT change anything in the DNG, AFAIK. Unless there's a certain setting of which I am not aware.

Otherwise my take:

Adobe, really? And when do you intend on releasing LR 7?!? December?

I mean, everyone knows that SQLite is its biggest champ and its biggest bug.
Everyone knows that C1 gets sharper images than LR.
Everyone knows that LR can't possibly utilize more than 8 CPU cores (logical, not necessarily physical).

Why now?

I did everything - DNG conversion (parallel computing, instead of the regular RAW files), more RAM ([email protected]), on an SSD, with 1:1 previews, on a 6700K. And I still can't get faster importing and DNG conversion, and I still need to wait when using Spot Removal or a local brush.

And that is on regular 30MP files... Don't ask me what happens when I go the 50MP path :/


----------



## mcschlotz (Jul 12, 2017)

The complaints regarding performance have already been well documented over the past year(s), and Adobe's competition is available for everyone (including Adobe) to review. Don't get me wrong, I'm glad to see they are finally talking about it openly and have supposedly made it their top priority, but given their long avoidance of this issue, it's hard not to be pretty skeptical about their sincerity going forward.

What would be more convincing? If they were to indicate they have put a moratorium on any additional changes until the performance issues have been resolved, AND that going forward no enhancements will be deployed if they negatively affect the corrected performance measurements.

JMTC
Matt


----------



## LDS (Jul 12, 2017)

privatebydesign said:


> Performance is not the same as speed. Bridge+ACR can open pretty much any image file so for an imaging professional it is a much more efficient tool, efficiency ends up equating to time.
> How many .PSB files do you have? LR can't see them, it is primarily a 'simple' photo program that can't open a lot of 'photo' files.



Does ACR open PSB files? If so, the fact LR can't is just an artificial limit imposed by Adobe (although LR doesn't support all the PS features and has a stricter limit on maximum image size).

If you mostly edit your photos in PS it's surely simpler to use ACR than to use LR and export to PS - but the RAW image processing is the same, which also makes PS and LR interoperable.

LR was never designed to be a full replacement for Photoshop, especially as long as it is sold separately at a cheaper price. IMHO, unless forced by competition, standalone LR won't support layers. If it is only sold along with PS in a CC subscription, there will be less commercial interest in neutering some advanced features.

It is true it was designed to be a "simple" application for digital camera workflows - remember the first versions? - and has probably far outgrown its roots. It was built on some readily available tools (the ACR RAW engine, the Lua engine, the SQLite database), and some changes are probably needed to improve speed.

Anyway, there's a lot of value in non-destructive and multi-image editing, which makes it complementary to ACR/PS, depending on your needs.


----------



## LDS (Jul 12, 2017)

Diko said:


> LR does NOT change anything in DNG AIFAK. Unless there's a certain setting of which I am not aware.



Yes, there is a setting you're not aware of. It's under "Catalog Settings", "Include develop settings in metadata inside...". When checked, the develop settings are recorded inside the file too (otherwise only the metadata is) - only for file formats that can support that, of course. The format is XMP.

It's a nice feature if you need to exchange files between LR instances without a full catalog or an added sidecar file, but it also means the files themselves are modified.

Whether that's good or bad depends on your workflow.
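(Side note: the embedded develop settings are stored as a plain-text XMP packet, so you can check from outside LR whether a file carries one. A minimal sketch - the helper name `has_embedded_xmp` is made up for illustration, not any Adobe API:)

```python
# Sketch: detect an embedded XMP packet by scanning for the standard
# "<x:xmpmeta" marker. XMP is embedded as plain text inside the file,
# so a byte scan is enough for a quick yes/no check.
def has_embedded_xmp(path, chunk_size=1 << 20):
    marker = b"<x:xmpmeta"
    tail = b""
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                return False
            # keep a short tail so a marker split across chunks is still found
            if marker in tail + chunk:
                return True
            tail = chunk[-len(marker):]
```

This only answers "is there a packet at all"; parsing the develop settings out of it is a different job.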


----------



## jeffa4444 (Jul 12, 2017)

AvTvM said:


> LDS said:
> 
> 
> > privatebydesign said:
> ...



I don't use DPP; I import everything into LR CC, either on a brand new top-of-the-range MacBook Pro with Touch Bar or on an iMac, then do more complex corrections in PS CC.

LR definitely runs slower than PS, and loading shots on the iMac is painfully slow, as is updating any complex adjustment. It's much faster on the MacBook Pro, but that machine is more than twice the price of the iMac, with a newer, faster processor, a more powerful GPU and double the memory. LR has been slow for at least three years. I would 100% agree that LR is intuitive to use, unlike PS, which is overly complex and illogical; I've learnt just about enough, but there are features I'm not using because they are way too complex to learn, and life is too short to do so (Apple, why did you kill off Aperture? Anyone could use it, and Adobe could learn lessons there).
If Adobe has woken up to the resource issues, it's about time; its software hogs resources, and again Apple proved you can retain complexity while conserving resources (until you count iCloud's local disc space!).

Frankly, if a really good competitor stepped up to the plate, Adobe would lose many customers.


----------



## Mikehit (Jul 12, 2017)

I like LR for its intuitive, quick edits.

I am using PS as my pixel editor, and I have tried OnOne but prefer the PS interface so see no reason to switch. 
I would have no objection to using Photomechanic or Breezebrowser to cull and do basic cataloging but because LR comes bundled with PS I would probably still use LR for simple adjustments.


----------



## Khalai (Jul 12, 2017)

Diko said:


> LDS said:
> 
> 
> > It may also create issues the one LR creates when storing changes inside DNGs - which means incremental backups need to backup the new DNG (possibly several MBs) too.
> ...



I feel the pain. i7-5820K @ 4.2 GHz, 32 GB RAM @ 2400 MHz, Samsung 950 Pro SSD for OS/applications and Samsung 850 EVO for photo storage. And Lightroom still can get sluggish or even unusable sometimes. And that should not happen on a PC of this performance potential...


----------



## grainier (Jul 12, 2017)

I do not care. I do not have budget or desire to endlessly upgrade my PC, so I gave up on LR when it was still v. 4


----------



## SteveM (Jul 12, 2017)

I'm appalled it has taken this long to address an issue which I seem to have been reading about for years. I doubt very much this is being done for no reason, I wonder if this is the opening salvo in a soon to be released Lightroom 7?


----------



## BillB (Jul 12, 2017)

SteveM said:


> I'm appalled it has taken this long to address an issue which I seem to have been reading about for years. I doubt very much this is being done for no reason, I wonder if this is the opening salvo in a soon to be released Lightroom 7?



Don't know whether it will be Lightroom 7, but my guess is that they already have something cooking. When you admit that you have a serious problem that needs fixing, it's a good idea to be able to pull a rabbit out of the hat pretty quickly. To some extent, LR has the problems of success. A lot of people use it for a lot of different things on a lot of different equipment. Not surprising that there are some pretty unhappy people out there. The good news for Adobe may be that the people that are most unhappy with Lightroom may not have any alternatives that work any better for them.


----------



## tpatana (Jul 12, 2017)

Diko said:


> Everyone knows that LR can't possibly utilize more than 8 CPU cores (Logical, not necessarily physical).



Do you mean cores or threads? I've had an 8c/16t i7 for a couple of years, and I'm pretty sure it loads all 16 threads.


----------



## Diko (Jul 12, 2017)

Khalai said:


> I feel the pain. i7-5820K @ 4.2 GHz, 32 GB RAM @ 2400 MHz, Samsung 950 Pro SSD for OS/applications and Samsung 850 EVO for photo storage. And Lightroom still can get sluggish or even unusable sometimes. And that should not happen on a PC of this performance potential...



And you know what pisses me off the most? The fact that I keep entering the same info, every time, into the feedback form that pops up now and then when LR is opened.

That means they DO have all the info and stats back at their HQ, and that "monkey survey" looks like polished PR to hardcore users like us.

I DID adapt all the way, but didn't receive the service I was looking for :/ _But now_ they are listening... WTF?


----------



## ethanz (Jul 12, 2017)

tpatana said:


> privatebydesign said:
> 
> 
> > ethanz said:
> ...



I can do batch crops in ACR.


----------



## tpatana (Jul 12, 2017)

ethanz said:


> tpatana said:
> 
> 
> > privatebydesign said:
> ...



Yes, if you don't need them individually cropped. Which I need.


----------



## Diko (Jul 12, 2017)

tpatana said:


> Diko said:
> 
> 
> > Everyone knows that LR can't possibly utilize more than 8 CPU cores (Logical, not necessarily physical).
> ...


Cores, not threads. Check the graph below. ;-)

But indeed I was mistaken about one fact: 4 cores, not 8.



BillB said:


> SteveM said:
> 
> 
> > I'm appalled it has taken this long to address an issue which I seem to have been reading about for years. I doubt very much this is being done for no reason, I wonder if this is the opening salvo in a soon to be released Lightroom 7?
> ...


I doubt LR 7 would see things improved enough.

IMHO it is time they stopped adding features and worked on core performance instead (read: the DEVELOP + LIBRARY modules). See points 1/, 2/ and 3/ below for a detailed argument as to why the core engines need rewriting from scratch to achieve the performance boost expected and awaited in the near future.

*1/ Get rid of SQLite* and perhaps... write something of their own.

SQLite is the database engine that reads and writes each pixel modification. It is the heart of the non-destructiveness. It is open-source (if I recall correctly) and is meant for embedded use on a local machine, which is exactly how LR uses it. However...

Back when it started (2006-2007-ish), with the 10-15 megapixels of the day, it probably seemed quite alright. But with 100 megapixels on the horizon for FF DSLRs/MILCs within the next 2-5 years, there is no chance SQLite stays usable. I am unsure what kind of logic or database could handle such tremendous data flows, but we DO already have 100MP medium format DSLRs, so they'd better think of something new! Perhaps heavier utilization of GPU compute could help them; I don't know if it's doable - I don't write that kind of low-level code myself. But the non-destructiveness takes its toll, and that cost has to be paid every time a pixel's virtual modification is written, whether it lives in RAM or in a parallel file on the SSD.
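(For context: a Lightroom catalog, the .lrcat file, is itself an ordinary SQLite database you can open with any SQLite client. The sketch below uses an invented toy schema - not Lightroom's real one - just to show the scale of what a catalog-style database writes per edit: small parameter rows, not pixels.)

```python
import sqlite3

# Toy stand-in for a catalog: each develop edit is a tiny row of
# parameters. (Schema invented for illustration; a real .lrcat has its
# own tables, but it is a plain SQLite file you could open the same way.)
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE develop_steps (image_id INTEGER, step TEXT, params TEXT)")
db.executemany(
    "INSERT INTO develop_steps VALUES (?, ?, ?)",
    [(1, "Exposure", "+0.35"), (1, "WhiteBalance", "5200K"), (2, "Crop", "3:2")],
)

# Rebuilding one image's edit history is a single indexed lookup:
history = db.execute(
    "SELECT step, params FROM develop_steps WHERE image_id = 1 ORDER BY rowid"
).fetchall()
print(history)  # [('Exposure', '+0.35'), ('WhiteBalance', '5200K')]
```

Rows this small are cheap for SQLite to read and write, so whether the database itself is the bottleneck is a separate question from how big the images get.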

*2/* *CPU handling*

You can check a few tests like *this one*, or this *major conclusion*, which states that in most scenarios even more than four physical cores makes no difference to performance:

And now we are entering the era of mandatory 8 cores / 16 threads. What Adobe makes of it depends on a complete rewrite of the core engine.

*3/* *GPU utilization* - almost NONE.

Bugs instead. In the new era of DirectX 12 and Vulkan - both of which give developers enough low-level control - we are still awaiting the miracle. Currently my GPU sits almost unused, like everybody else's. The implementation of that GPU feature looked more like marketing trickery than a true boost...

What do you think? This Monkey Junkey survey is, for me, still just PR... nothing more, nothing less ;-)


----------



## Batman6794 (Jul 13, 2017)

AvTvM said:


> the UI is not "slightly different", but "night and day" different.
> PS = absolutely unusable for people like me [unwilling to "learn" software] .
> LR = quite fine for people like me [people wanting a intuitive UI]
> 
> ...



Capture One Pro is not on your list and probably should be. I also bailed on lightroom because it doesn't support my current cameras and I won't rent software. I was mad at Adobe for a while, now I'm thankful they drove me to Capture One. 

Try the 30 day trial, nothing to lose.


----------



## TomDibble (Jul 13, 2017)

AvTvM said:


> Mikehit said:
> 
> 
> > AvTvM said:
> ...



Generally I don't agree, as a software guy. Yes, OSes have metadata management and search databases (e.g. Spotlight on OS X), but then you are using a very general-purpose database that is not tuned to the kinds of searches you are doing, instead of a special-purpose database indexed on the specific fields you care about. If performance is important to you, the special-purpose database will win out nearly every time.

That said, that data *does* belong in the sidecar (not in the original raw file headers, as I'd rather absolutely limit the number of edits to those original files to zero if possible), and I really wish that were the default in Lightroom or even a setting we could set globally instead of on each catalog we create (I separate various projects with separate catalogs, keeping everything self-contained).

Also, with regards to metadata headers within (some) file formats: this is practical for small file sizes, but the larger the file size the less efficient having frequently-updated metadata in the file becomes. Adding a new header generally means rewriting the whole file on disk, so to put a 20 character title on a 32MB image, you are writing 32,000,000 bytes of data instead of just 20. If you want metadata associated with files "transparently" then using file system-level metadata is a great solution (just a small amount of data written to disk instead of the whole file), although it is limited in what it will store and if you move your files around between file systems you will lose metadata which isn't fully supported in the destination (or any hop along the way). The .XMP sidecar file format is dreadful (as all XML-based formats are), but better than an embedded header. Also, the same .XMP format works no matter what the original image is (CR2, JPEG, DNG, TIF, PNG, etc), which allows that IO code to be as highly optimized as possible.
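The byte-count point above can be sketched directly. A toy comparison, with invented file names and a deliberately naive whole-file rewrite standing in for an embedded-header update:

```python
import os, tempfile

workdir = tempfile.mkdtemp()
raw_path = os.path.join(workdir, "IMG_0001.CR2")
with open(raw_path, "wb") as f:
    f.write(os.urandom(32 * 1024 * 1024))  # stand-in for a 32 MB raw file

def update_embedded_title(path, title):
    """Naive embedded-metadata update: prepend a header, rewrite everything."""
    with open(path, "rb") as f:
        payload = f.read()
    with open(path, "wb") as f:
        f.write(title.encode() + payload)
    return len(title) + len(payload)  # bytes written to disk

def update_sidecar_title(path, title):
    """Sidecar update: only the tiny .xmp file is written."""
    sidecar = os.path.splitext(path)[0] + ".xmp"
    data = ("<x:xmpmeta><dc:title>%s</dc:title></x:xmpmeta>" % title).encode()
    with open(sidecar, "wb") as f:
        f.write(data)
    return len(data)  # bytes written to disk

print(update_embedded_title(raw_path, "Twenty-char title!!!"))  # 33554452 - the whole file
print(update_sidecar_title(raw_path, "Twenty-char title!!!"))   # 64 - just the sidecar
```

Real embedded updates are smarter than this (padding, in-place rewrites where the format allows it), but the asymmetry is the point: a sidecar write is bounded by the metadata size, not the image size.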


----------



## pwp (Jul 13, 2017)

Diko said:


> And you know what pisses me of the most? The fact that I am entering every time the same info on their feedback form that appears when LR is opened now and then.
> 
> That means they DO have all the info and stats back their HQ and all that "monkey survey" seems like refined PR to the hardcore users like us.
> 
> I DID adapt all the way, but didn't received the service they were looking for :/ _But now_ they are listening.... WTF?



You're right of course; the survey is certainly a PR exercise. They've known the facts for years. But at least there is a glimmer of hope. As a daily LR user all the way from the first public beta, I've seen it get slower and slower with each version. I'd say the last genuinely quick one was v4. It's become a real love/hate relationship. I love what it purports to deliver - its UI, features and so on. But the speed humps that have been explained ad nauseam here and on other forums have driven me, very reluctantly, to abandon LR CC for C1 Pro, which I'm still learning the finer points of but whose rapid performance I'm appreciating.

My business simply couldn't afford the punishing slowdowns with LR when there are hundreds of images to process and a same-day or next-day deadline. Adobe used to be the good guys. Times change. ???

-pw


----------



## AvTvM (Jul 13, 2017)

@TomDibble: understand the points you make and agree with many of them.

but ... 

typically (my) raw image files are worked on only once: tagging on import, plus image editing (wb, contrast, colours, lens and perspective corrections, local adjustments, etc). only in very few cases do i go back later and change some or all of these edits to create another version of the "output" file (currently .jpg).

yes, by all means i would like to keep that possibility and non-destructive edits.

but if the "edit recipes" plus all metadata were stored directly in the raw file header, forcing the entire file to be re-written, it would be no real performance issue, since it happens so very rarely. it may be more frequent for a few users, but i am convinced the vast majority of all images imported into a raw converter/image editor like LR are edited only once.

the database - both the very concept and the weak implementation - is the root cause of LR's performance issues. getting rid of the entire database concept while retaining the LR/ACR raw engine, the LR UI and the editing capabilities (limited to photo-centric tasks, but powerful) would be the "LR lite" software i would like to get.

image tagging will be done by the OS pretty soon - face and subject recognition. i am sure that in windows 11 i will be able to type "eiffel tower, rita" in the search field and it will quickly list all my images of "rita in front of or somewhere up on the eiffel tower". add the GPS data in every image from newer cameras... and there will be absolutely no more need to manually tag my images in an app like LR or any other image editor.

same for exif and iptc data - any os should allow immediate access and edits - including batch edits - for that data (stored in the image file header).

again: that duplicate LR database layer is not needed for my use (and presumably that of most LR users).


----------



## AvTvM (Jul 13, 2017)

@batman6794: re C1 Pro... i have looked at it. i don't like the user interface, especially that i need to handle explicit "LAYERS" just to make any (small) local adjustment... i very much prefer the LR concept and approach to edits: no layers, no explicit masks and the like. select the tool - stamp, clone, gradient, whatever - and DIRECTLY apply it to the desired image area, done!

the LR concept of "no layers, but a chronological stack of actions" is the major reason for the app's success! simple, clear, intuitive and "more than enough" for all of my edits. i do not need the depth/pixel level of PS-style edits for my images. i also never do "multiple capture stuff"... no composite images, no stitching/panos, no focus stacking, no HDRs. not my style, not needed here. just like "no time lapse, no video, no moving images" either. others do. i don't. many others also do not.

Canon DPP, Adobe PS Elements etc. are "too little" for my needs... whereas PS or C1 Pro or Affinity etc. are "too much" for what i need. LR's editing features are "right there". if adobe offers me a - permanent license! - blazingly fast "LR 7 lite" version - sans database/catalogue - i will buy it instantly.


----------



## LDS (Jul 13, 2017)

Diko said:


> *1/ **Get rid of the SQLITE* and perhaps... write something of their own.



I hope they'll avoid reinventing the wheel, because that takes a lot of time and is a recipe for disaster. LR doesn't store "pixel modifications" in the database - AFAIK it stores "actions", which are computed and applied to the original image to produce the displayed one. That is what allows LR to be "non-destructive" and to apply the modifications in the "best" order regardless of the order the user selected them. The data stored in the database or sidecar files is small.

SQLite may or may not be one of the bottlenecks, depending on how it is used. Single-image editing should be the least affected situation.

Yet computing changes on large images quickly is surely a challenge, and it will require deep changes in the way LR works. Doing work in parallel is one approach; the issue is when one modification needs the previous one to be applied before computation can occur. An image could be "split" into slices to be computed in parallel - the tricky part is the boundaries, and coordinating everything because there's a shared display area. Bringing a GPU in also requires deep changes.

Adobe can do it, but I don't expect a "revolutionary" LR release; IMHO Adobe will introduce changes that tackle what customers feel is most important. It has telemetry, but telemetry may not say what users "feel" is slow, regardless of the actual numbers - i.e. saving a few tenths of a second when a brush is applied may make the application feel more responsive than saving several seconds while importing or generating a panorama.
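The slice idea, including the boundary problem, can be sketched with plain Python lists: the hypothetical example below runs a 3-row vertical box filter over horizontal bands in parallel, handing each band a one-row "halo" from its neighbours so the band boundaries come out identical to the serial result.

```python
from concurrent.futures import ThreadPoolExecutor

def blur_rows(rows):
    """3-tap vertical average; `rows` includes one halo row at each end."""
    out = []
    for i in range(1, len(rows) - 1):
        out.append([(a + b + c) / 3
                    for a, b, c in zip(rows[i - 1], rows[i], rows[i + 1])])
    return out

def parallel_blur(image, bands=4):
    h = len(image)
    padded = [image[0]] + image + [image[-1]]   # replicate edge rows
    step = (h + bands - 1) // bands
    # each slice carries its band plus one extra row on each side (the halo)
    slices = [padded[s : s + step + 2] for s in range(0, h, step)]
    with ThreadPoolExecutor(max_workers=bands) as pool:
        parts = pool.map(blur_rows, slices)
    return [row for part in parts for row in part]

image = [[float(r * 10 + c) for c in range(4)] for r in range(8)]
# parallel result is bitwise identical to the serial one:
print(parallel_blur(image) == blur_rows([image[0]] + image + [image[-1]]))  # True
```

Dependent edits (where step N needs step N-1 applied first) break this neat picture, which is exactly the coordination problem described above.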


----------



## bitm2007 (Jul 13, 2017)

LR's speed isn't an issue for me. I'd prefer they put the time and effort into Develop module improvements. Adobe, it's now the latter half of 2017 and we are still expected to use Lightroom CC 2015. Pull your finger out, guys.


----------



## Mikehit (Jul 13, 2017)

bitm2007 said:


> LR's speed isn't an issue for me. I'd prefer it they put the time and effort into development mode improvements , Adobe it's now the latter half of 2017, and we are still expected to use Lightroom CC 2015. Pull your finger out guy's.



They are two different programs for two different purposes. 
What do you want LR to do?


----------



## BillB (Jul 13, 2017)

AvTvM said:


> @batman6794: re C1 Pro .. i have looked at it. don't like the user interface. especially that i need to handle explicit "LAYERS" just to make any (small) local adjustments ... i very much prefer LR concept and approach to edits: no layers, no explicit nasks and the like. select the tool, stamp, clone, gradient, whatever ... and DIRECTLY apply to desired image area, done!
> 
> LR concept of "no layers, but chronology stack of actions" is the major reason for the apps success! simple, clear, intuitive and "more than enough" for all of my edits. i do not need the depth/pixel level of PS style edits for my images. i also never do "multiple capture stuff" ... no composite images, no stitching/panos, no focus stacking, no HDRs. not my style, not needed here. just like "no time lapse, no video, no moving images" either. others do. i dont. many others also do not.
> 
> Canon DPP, Adobe PS elements etc. are "too little" for my needs ... whereas PS or C1Pro or Affinity etc. are "too much" for what i need. LR editing features are "right there". if adobe offers me a - permanent license! - blazingly fast performing "LR 7 lite" version - sans database/catalogue - i will buy instantly.



Reminds me of "Goldilocks and the Three Bears". This porridge is too hot! This porridge is too cold! Can Adobe put together a Lightroom that is just right for Baby Bear by getting rid of the database and putting everything in file headers? I don't think they can meet my needs without sidecar files. Would a Lightroom that allowed us to tailor its functionality be nice? Of course.


----------



## Khalai (Jul 13, 2017)

BillB said:


> AvTvM said:
> 
> 
> > @batman6794: re C1 Pro .. i have looked at it. don't like the user interface. especially that i need to handle explicit "LAYERS" just to make any (small) local adjustments ... i very much prefer LR concept and approach to edits: no layers, no explicit nasks and the like. select the tool, stamp, clone, gradient, whatever ... and DIRECTLY apply to desired image area, done!
> ...



I'm happy with the Lightroom catalogue philosophy, I'm very happy with the UI, and I'm quite happy with the image output. I'm very UNHAPPY about its speed, especially considering that having a high-end PC doesn't mean squat to LR. Give me a performance boost, keep the LR philosophy as is, and I'll be a happy camper (as long as they offer LR as a standalone product and not CC-only, I might add).


----------



## ronaldbyram (Jul 13, 2017)

AvTvM said:


> rwvaughn said:
> 
> 
> > The catalog system is bloated and a resource hog.
> ...



They need to do something with it... anything would be an improvement, I hope. My C: SSD, where the catalog lives, is almost full (I guess that's my mistake for putting it on the home drive).
They need a better option during install to spread the files across different drives where possible.


----------



## bitm2007 (Jul 13, 2017)

Mikehit said:


> bitm2007 said:
> 
> 
> > LR's speed isn't an issue for me. I'd prefer it they put the time and effort into development mode improvements , Adobe it's now the latter half of 2017, and we are still expected to use Lightroom CC 2015. Pull your finger out guy's.
> ...



Two different aspects of the same program, surely !

Don't get me wrong, Lightroom is an excellent all round piece of software, but it's the jack of all trades, the class leading master of very few, so there is plenty of room for improvement.


----------



## markesc (Jul 14, 2017)

I thought I would gain some speed by upgrading computers... ha!

Old computer: 2012 Dell xps 8500 intel i7 3770, 8gb ram, pointless amd video card, samsung 830 ssd

New computer: AMD Ryzen 7 1700, overclocked to 3.7ghz on all 8 cores. Evga gtx 1080ti, Samsung 960pro, two extra samsung 850's, one of which I use as a dedicated scratch disk. 16gb ram.

Difference: I'd say about 10-15% at best with LR.

Whether I enable or disable GPU usage, the thing just gets slower and slower after about the 50th photo, so I have to close and reopen it... good times!


----------



## stevelee (Jul 14, 2017)

I talked with some pros who use Lightroom all the time and asked them about the advantages over Photoshop and when and why I might want to use it. The answers they gave were largely in the "time is money" category, and I could see why they used it, but didn't get any reason why I might. I don't often have that many pictures shot under the same lighting conditions, so my rare batches are just a handful of pictures, easily done in Filmstrip mode of Bridge. And I don't feel the need for the database. Just having pictures made on the same day put into the same folder on the computer is organization enough for me. Then if there are multiple days in one project or trip, I'll put those folders into an enclosing folder topically named.

Scott Kelby flat-out tells people to use Lightroom instead of Bridge and Photoshop, but doesn't much elaborate.


----------



## Otara (Jul 14, 2017)

For me it's that you aren't changing the underlying images. It means I can keep the original and still edit it, rather than having to keep track of both original and edited versions.

Also, having built-in options for uploading to Flickr etc. makes life easy, and the built-in book-making is nice too. A one-stop shop, basically, unless you're doing very complex editing.


----------



## stevelee (Jul 14, 2017)

Otara said:


> For me its that you arent changing the underlying images. Means I can keep the original and still edit it, rather than having to keep track of both original and editted versions.
> 
> Also having built in options for upload to flickr etc makes life easy, built in for making books too is nice. One stop shop basically, unless you're doing very complex editting.



Using Bridge and Photoshop, I keep the original RAW file and the generated .xmp file that is automatically placed in the same folder, which allows me to revisit the RAW file with my edits still in place to tweak if I wish. For me that pretty much accomplishes the same things Lightroom does, in a way that seems to make more sense to me. I can just move the folder wherever I wish without dealing with a database issue. I know you can move stuff in LR, too; I just haven't bothered to learn how.

I might also save a .psd file if I like my edits and think I might want to revisit them or use the file as a basis for a version to print. But I don't bother if all I want is a JPEG to put on the web. Also, if I do HDR or panoramas in Bridge, it creates a .dng file I can come back to and change the ACR settings on.

I don't have any trouble keeping up with all that myself. Everything winds up in the same folder except the JPEGs that go in the folder for its place in the website. For the latter, I normally keep the same file name as the original .CR2 with just the .jpg extension instead, so I always know where to find the original if I decide I want to do a better web version.

So while I understand why LR works better for a lot of other people, and I understand that even folks who are doing about the same workflow I do might still prefer to use LR, I don't see any advantage for me to switch. I'm going to wind up taking almost everything into Photoshop eventually anyway, out of habit even if not out of necessity.

But, yes, thanks. I still am interested in reasons why LR works well for people. Perhaps some day I'll have a project that I think will go better for me in LR if I know how other people use it and benefit from it.


----------



## ethanz (Jul 14, 2017)

stevelee said:


> Otara said:
> 
> 
> > For me its that you arent changing the underlying images. Means I can keep the original and still edit it, rather than having to keep track of both original and editted versions.
> ...



I'm in the same boat as you. I've sometimes thought about trying Lightroom to see what it's like. No idea why people use it over PS.


----------



## AvTvM (Jul 14, 2017)

> I've sometimes thought about trying Lightroom to see what it's like. No idea why people use it over PS.



simple: ps user interface is atrocious for newcomers. lr user interface lets anybody work away immediately, once you overcome the initial hurdle of having to "import image files into the database", rather than just "opening files". for light editing work, lr is a really good tool, but the database and associated performance issues are a problem.


----------



## tpatana (Jul 14, 2017)

AvTvM said:


> > I've sometimes thought about trying Lightroom to see what it's like. No idea why people use it over PS.



> simple: ps user interface is atrocious to newcomers. lr user interface lets anybody work away immediately, once you overcome the initial hurdle of having to "import image files into the database", rather than just "opening" them. for light editing work, lr is a really good tool. only the database and associated performance issues are a problem.

That, and half my shoots I'm dealing with 2k+ images and edit 100+ of them. LR saves lives.


----------



## SteveM (Jul 14, 2017)

tpatana said:


> That, and half my shoots I'm dealing with 2k+ images and edit 100+ of them. LR saves lives.

Give 'Photo Mechanic' a try. Blazingly fast on the 'cull' and you can then open them directly into Lightroom, Capture One, PS, and keywording is also excellent.
I personally use it to cull from a folder on my hard drive, allocating 1 star for those to be deleted which I do through Photo Mechanic in one batch at the end of my analysis, this leaves me with the photos I wish to keep.


----------



## Diko (Jul 14, 2017)

stevelee said:


> I talked with some pros who use Lightroom all the time and asked them about the advantages over Photoshop and when and why I might want to use it. The answers they gave were largely in the "time is money" category, and I could see why they used it, but didn't get any reason why I might. I don't often have that many pictures shot under the same lighting conditions, so my rare batches are just a handful of pictures, easily done in Filmstrip mode of Bridge. And I don't feel the need for the database. Just having pictures made on the same day put into the same folder on the computer is organization enough for me. Then if there are multiple days in one project or trip, I'll put those folders into an enclosing folder topically named.
> 
> Scott Kelby flat-out tells people to use Lightroom instead of Bridge and Photoshop, but doesn't much elaborate.



Mostly it is about workflow. I don't understand why, in an "Adobe asks for LR performance feedback" topic, people actually began discussing LR vs PS+Bridge - bad AvTvM. ;-)


Actually LR has always been meant to be used as:

1/ A culling processor (though most colleagues who cull on a PC, rather than mostly in camera, avoid doing it in LR these days, since it's way too slow for this kind of job).
2/ A catalogue.
3/ A developing environment.
4/ A publishing tool.

Now, bearing in mind that Camera Raw was invented just for 3/, the comparison doesn't make sense. And when you add Bridge - which is amazing if you collaborate with other people (I don't remember: does it have a server part for LAN workflows?) or across a few programs like InDesign or AI - it is the right tool for that job.

But if you shoot events (photo counts usually in the hundreds), or you prefer to tether-shoot, or you just want to see the results with corrections applied on a big screen before the last cull - LR was supposed to be the tool ;-)

So the whole LR vs PS+Bridge discussion is pointless. Each of us has his/her own workflow - ergo, different tools for the job.


----------



## tpatana (Jul 14, 2017)

For me culling on LR is fast enough, but the pain is when I go through edits one by one. Especially the crop/rotate tool takes ages to activate. If I restart LR the first 10-20 are fine, but then it slows down again. That's my #1 fix-wish for the next LR version.


----------



## ethanz (Jul 15, 2017)

SteveM said:


> That, and half my shoots I'm dealing with 2k+ images and edit 100+ of them. LR saves lives.



> Give 'Photo Mechanic' a try. Blazingly fast on the 'cull' and you can then open them directly into Lightroom, Capture One, PS, and keywording is also excellent.
> I personally use it to cull from a folder on my hard drive, allocating 1 star for those to be deleted which I do through Photo Mechanic in one batch at the end of my analysis, this leaves me with the photos I wish to keep.

That's how I 'cull' through my images: make them one star in Bridge, then edit all the one-stars in PS. Not sure how that is any different from LR or how LR saves time. I also have 2-3000 images to go through. I've been using PS for over ten years, so I'm used to the interface; I wouldn't call it atrocious. I guess I just don't know what else is out there.


----------



## Diko (Jul 15, 2017)

ethanz said:


> SteveM said:
> 
> 
> > ...That, and half my shoots I'm dealing with 2k+ images and edit 100+ of them. LR saves lives...
> ...



For culling I use _*FastRaw viewer*_. It really counts ;-) Because as they say: "_To Choose Correctly You Need to See a Correct Preview_".


----------



## EdelweissPirate (Jul 15, 2017)

AvTvM said:


> for light editing work, lr is a really good tool, but the database and associated performance issues are a problem.



Others have attempted to convey this to you without success, so maybe I'm tilting at windmills here:

The association between Lightroom's database and its performance issues exists only in your imagination. It's a mistake to claim that SQLite is an _a priori_ cause of Lightroom's performance issues.

All of the fixes you suggest, especially storing all the metadata in the image headers, would be slower, uglier, and more failure-prone than Lightroom's SQLite approach. I'm not trying to insult you here, but your understanding of programming/computer science is not quite what you seem to think it is.

There's no shame in having an incomplete understanding of these subjects, but you might at least consider that people who do this for a living (including other posters on this forum and the Lightroom developers) have thought about these things and rejected them for good reason.

SQLite (or something very much like it) is the right tool for this particular job, though I'd have preferred a true multi-user database to allow catalogs to be stored on networked drives. Others have already explained why OSes are not databases and why Spotlight and the Windows search/indexing function would be worse than Adobe's use of SQLite. One thing I didn't see mentioned is that if you rely on the OS to build its own database for your photos, switching between Windows and OSX (either way) becomes enormously problematic.

Why would you want to lock yourself into a particular OS simply because your photo editing app relies on that OS's database? Additionally, because your "ideal" OS-database would be entirely different for Mac vs. Windows, Adobe would have to spend time and money developing hooks for two entirely independent database structures. Oh, and if those OS-based databases don't support exactly the same metadata, then your feature set diverges on the two platforms. Speaking generally about DAM software, you _really_ want a database, and you _really_ want it to be completely independent of the OS.

AvTvM, you don't seem to reject the idea of Lightroom using a database, but rather you seem to think that Adobe has chosen the wrong one. I'm going to go out on a limb here and speculate that you think that somehow SQLite is a "big," ponderous database, so you pin all of Lightroom's performance issues on SQLite. SQLite isn't a slow database. It's efficient and fast for small data sets, and no matter how big you think your catalog is, it's not a big data set in the grand scheme of things.

If you don't think that SQLite is a big, slow database, then what's your objection to it?
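For what it's worth, the "SQLite is slow" claim is easy to sanity-check. Here's a minimal sketch using Python's built-in sqlite3 module, with a toy schema invented for illustration (not Lightroom's actual catalog layout):

```python
import sqlite3

# Toy photo catalog -- an invented schema, NOT Lightroom's real one.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE photos (
    id INTEGER PRIMARY KEY,
    filename TEXT,
    rating INTEGER,
    keyword TEXT)""")
conn.execute("CREATE INDEX idx_rating ON photos(rating)")

# 100,000 rows -- a large catalog by most users' standards.
rows = ((i, f"IMG_{i:06d}.CR2", i % 6, "sports") for i in range(100_000))
conn.executemany("INSERT INTO photos VALUES (?, ?, ?, ?)", rows)
conn.commit()

# An indexed query over the whole catalog is effectively instant.
count = conn.execute(
    "SELECT COUNT(*) FROM photos WHERE rating = 5").fetchone()[0]
print(count)  # -> 16666
```

On any modern machine this whole script runs in well under a second; raw SQLite lookup speed is a poor suspect for Lightroom's sluggishness.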

That's not to say that Lightroom doesn't have issues. It does, but I don't pretend to know the root cause of those problems. For example, as far as I can tell, exporting JPEGs and building 1:1 previews (which are basically the same thing) should be embarrassingly parallel or nearly so. Yet, as the plot Diko posted shows, Lightroom doesn't efficiently use more than four CPUs for these tasks. In other words, building 1:1 previews should take half as much time with eight physical cores as it does with four, but instead it takes nearly the same amount of time.

There's a reason for that, and I don't know what it is. But I doubt that it's because all of the Lightroom developers know less about parallel computing than I do. (I don't know very much).

I'd love to see better performance from Lightroom. I agree that Adobe hasn't taken users' performance complaints seriously, and I agree that there are other workflows that have advantages over Lightroom. But pinning all of your complaints on SQLite only highlights your incomplete understanding of the issues at hand.


----------



## Khalai (Jul 15, 2017)

EdelweissPirate said:


> That's not to say that Lightroom doesn't have issues. It does, but I don't pretend to know the root cause of those problems. For example, as far as I can tell, exporting JPEGs and building 1:1 previews (which are basically the same thing) should be embarrassingly parallel or nearly so. Yet, as the plot Diko posted shows, Lightroom doesn't efficiently use more than four CPUs for these tasks. In other words, building 1:1 previews should take half as much time with eight physical cores as it does with four, but instead it takes nearly the same amount of time.



According to Puget Systems, exporting images is about the only thing which goes as much parallel as it can, yielding .97 efficiency, while other tasks are much more problematic.


| Action | Parallel Efficiency (1 is perfect) |
| --- | --- |
| Importing images from USB | 0.00 |
| Exporting images to disk | 0.97 |
| Convert from RAW to DNG | 0.69 |
| Generate 1:1 Previews | 0.77 |
| Generate Smart Previews | 0.51 |
| Create HDR image | 0.60 |
| Create Panorama image | 0.44 |
| Facial Recognition | 0.20 |


----------



## EdelweissPirate (Jul 15, 2017)

Yeah, not so much. Puget Systems is calculating not the theoretical parallel efficiency of each task but rather the efficiency of Lightroom as you add processors. Some tasks are inherently parallel while others are inherently serial. 

Moreover, Puget Systems is calculating the efficiency only over the first few processors, and none of those curves sustains that efficiency for long. When I say that exporting images and generating previews should be embarrassingly parallel, I mean that the time for the task should fall continuously with the number of processors.

All of the Puget Systems curves flatten out somewhere between 2 and 8 cores, depending on the task. I'm suggesting that for generating 1:1 previews and exporting images, the time-vs-cores line should (theoretically) keep falling like 1/n - that is, with a quadratically decreasing slope.

Exporting images doesn't "go as much parallel as it can," in your words. The speedup is quasi-linear at first but then drops off dramatically. Their task takes 200 seconds with three cores and about 100 seconds with six cores. So far, so good. But twelve cores should then take 50 seconds, and their plot makes it look more like 90 seconds. So the parallel efficiency may be 0.97 for the first few processors, but it drops to almost zero after that.

Realistically, the speedup in Puget Systems' export test may be limited by disk or memory bandwidth, which is understandable. But generating 1:1 previews should (insofar as I understand it) be at least as parallelizable, and Lightroom's behavior doesn't reflect that. This might be an area where speeding up Lightroom is straightforward, but I don't think we know enough about the problem to say for sure.
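To put numbers on that flattening: parallel efficiency as these benchmarks define it is t1/(n*tn), and Amdahl's law shows how even a small serial fraction caps the speedup. A quick sketch with hypothetical timings shaped like the export curve described above (not Puget's actual measurements):

```python
# Parallel efficiency as the benchmark defines it: eff(n) = t1 / (n * tn).
def efficiency(t1, tn, n):
    return t1 / (n * tn)

# Amdahl's law: speedup on n cores when a fraction s of the work is serial.
def amdahl_speedup(n, s):
    return 1.0 / (s + (1.0 - s) / n)

t1 = 600.0                          # hypothetical single-core time (s)
timings = {3: 200.0, 6: 100.0, 12: 90.0}
for n, tn in timings.items():
    print(f"{n} cores: efficiency {efficiency(t1, tn, n):.2f}")

# Even 5% inherently serial work caps a 16-core machine at about 9x:
print(f"{amdahl_speedup(16, 0.05):.2f}")  # -> 9.14
```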


----------



## Khalai (Jul 15, 2017)

EdelweissPirate said:


> Yeah, not so much. Puget Systems is calculating not the theoretical parallel efficiency of each task but rather the efficiency of Lightroom as you add processors. Some tasks are inherently parallel while others are inherently serial.
> 
> Moreover, Puget Systems is calculating the efficiency only at the first few processors. None of those curves shows Puget Systems' calculated efficiency for long for long. When I say that exporting images and generating previews should be embarrassingly parallel, I mean that the time for the task should fall continuously with the number of processors.
> 
> ...



Still a useful comparison, to a degree. Not many people have more-than-eight-core CPUs at home; I don't think the E5-2699 v4/v5 is that common in home PCs.


----------



## AvTvM (Jul 15, 2017)

@edelweiss: i thought i had made it clear that i don't like both: 1. LR using a database at all, and 2. LR using a grossly underperforming implementation of a database. i think it is proven beyond reasonable doubt that LR's performance problems - even on powerful hardware - are rooted in its database, since other raw converters/image editors - anything from Canon DPP to Capture One Pro to Adobe Bridge - do NOT have those performance problems.

i am fine with LR's (ACR) raw conversion (even though Canon DPP does an even better job on Canon raws), i do like the LR user interface and the editing functionality. while limited, it is all i need. the only thing i don't like and don't need is the database. so for me the preferred solution would be a well-performing "LR lite" with identical raw converter and image editor functionality but without a database. i have no problem finding image files in Windows thanks to my well-thought-out and disciplined file and folder naming scheme and structure. furthermore, there are apps available that conduct AI-based, content-specific search for images. no manual image tagging needed. no Lightroom database needed. fast, efficient, automatic.


----------



## LDS (Jul 17, 2017)

AvTvM said:


> i think it is proven beyond reasonable doubt, that LR performance problems - even on powerful hardware - are rooted in its database, since other raw converters/image editors - anything from canon dpp to capture 1 pro to adobe bridge - does NOT have those performance problems.



Here you can find a description of the LR architecture: http://www.troygaul.com/LrExposedC4.html. it's not up to date, but AFAIK it didn't change much since then.

It's not proven that SQLite is the issue. Your empirical observation is based on the fact that LR employs a database and the others don't, so the culprit must be the database. Unluckily, there are many other, less obvious details to take into consideration - e.g. LR's use of Lua.

SQLite is used in many products (see here https://sqlite.org/famous.html) - if it had so many performance issues, it would have been already replaced by something else.

While I would like LR to also support a stand-alone database (so it could run on a different server, and several LR clients could connect and work on the same catalog...), such a database would be much heavier and more difficult to install for single-user use than the one LR actually embeds.

Anyway, what matters is to pinpoint exactly where the performance issues arise. There are tools that measure software performance quantitatively and identify precisely where the bottlenecks are. That is the only way to ensure the right piece of software gets optimized - but it's something only Adobe can do at the proper level of detail.
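That kind of measurement can be sketched with Python's built-in profiler; the workload here is a made-up stand-in, not anything from LR:

```python
import cProfile
import io
import pstats

def halve(img):
    # stand-in for an expensive per-image operation
    return [px // 2 for px in img]

def process_all(images):
    return [halve(img) for img in images]

images = [list(range(1000)) for _ in range(200)]

pr = cProfile.Profile()
pr.enable()
process_all(images)
pr.disable()

# Rank functions by cumulative time: this is what pinpoints a bottleneck
# instead of guessing (database? UI layer? image math?).
out = io.StringIO()
pstats.Stats(pr, stream=out).sort_stats("cumulative").print_stats(3)
report = out.getvalue()
print("function calls" in report)  # a profile report was produced
```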

The fact that they're asking users what they believe is a priority will tell them where to start looking.


----------



## Mikehit (Jul 17, 2017)

I think there are two things driving this. Firstly, functionality was given priority over speed, and LR has now reached a stage of rapidly diminishing returns on functionality. More functions mean more data in XMP files, which means more data to handle.
Secondly, the rise of programs like Photo Mechanic and BreezeBrowser means professionals are using those systems to cull photos, but also for digital asset management, because they are so much quicker for their purpose - and that is Adobe's next step.

Which raises the question of where people want the speed. If they want speed of review and culling, there is the Photo Mechanic approach: a review module where you check the embedded raw previews for sharpness/composition, the only data you apply is a 'delete' flag (and maybe a rating), and when you have been through your complete ingest of 5,000 photos you press a button to update the library. From there it goes as it does now.

Some people complain about the time it takes to apply edits, and that is a completely different set of issues - but I have occasions where LR is quick and others where it is very slow, and I wonder if that is due to other things the computer is doing rather than LR itself.


----------



## AvTvM (Jul 17, 2017)

Mikehit said:


> I think there are two things ...
> ... speed of review and culling ...
> ... time it takes to apply edits ...



I agree. Where I don't agree: why has MIGHTY Adobe to this day NOT been able (or willing) to provide
1. an IMPORT module in LR that is every bit as simple, fast and painless as Photo Mechanic?
2. an EDIT module that performs fast by fully utilizing today's hardware (multi-core/multi-threaded CPUs, lots of RAM, blazingly fast NVMe SSDs, etc.)?

My explanation: Adobe has been resting on their laurels - or rather on their big fat corporate ass - and rather than product development they are primarily focused on coercing as many users as possible into their rental subscription scheme and raking in the cash.


----------



## Mikehit (Jul 17, 2017)

AvTvM said:


> Mikehit said:
> 
> 
> > I think there are two things ...
> ...



Alternative reading: 
You don't change architecture and add functionality at the same time. They saw functionality as the more important to keep ahead of the competition. Now that is in place the architecture is coming under the spotlight more than it was before. 
A company like Adobe does not get to be where it is (and maintain that position) through incompetence or luck - and just because they don't agree with you, or you don't understand their decisions, doesn't mean they don't know what they are doing.


----------



## LDS (Jul 17, 2017)

AvTvM said:


> 1. an IMPORT Module in LR, that is every bit as simple , fast and painless as PhotoMechanic?



The import module of LR does more - it can generate different types of previews, convert to DNG, apply presets, etc. Photo Mechanic does less, does it better, and costs as much as LR. It's a much more focused application - which justifies its price only if you need that focus. When USB 2.0 and spinning disks were the norm, the LR code probably looked adequate - now, with USB 3.0/Thunderbolt and SSDs, it may not be.



AvTvM said:


> 2. An EDIT module that performs fast by fully utilizing today's hardware (multi-core/Threaded CPUs, lotsa RAM, blazingly fast NVME SSDs etc.)



Note that not everybody has twelve cores, 128GB of RAM and NVMe disks. LR is still a $150 application, often sold in a bundle with something else, and it can end up being used on far less powerful machines. You need to find a balance between system requirements and performance. If it were aimed only at 7D/5D/1D users (or users of equivalent cameras from other brands), maybe Adobe could assume matching PCs - but there are a lot of owners of less expensive cameras, with less powerful PCs, among LR users as well.



AvTvM said:


> My explanation: Adobe has been resting on their laurels



It's highly probable - as long as a product sells well enough, there is little commercial reason to invest a lot in modifying it deeply, because of the risk of new bugs to chase and fix, and of stability issues (just look at GPU support...). Look at how Canon itself is often conservative with new models - it's the same approach.

When competitors get close to you, or surpass you, if you're not stupid you understand the time has come for the required changes, and the risks are balanced against the risk of losing market share.


----------



## AvTvM (Jul 17, 2017)

LDS said:


> AvTvM said:
> 
> 
> > 2. An EDIT module that performs fast by fully utilizing today's hardware (multi-core/Threaded CPUs, lotsa RAM, blazingly fast NVME SSDs etc.)
> ...



No, not everyone has a fast PC. But it doesn't matter with LR anyway, because LR does NOT take advantage of powerful hardware when it is there. LR lets it sit idle... that's the second aspect of the LR performance problem we are discussing here. The other aspect is the poorly performing, unnecessary database (both concept and implementation).

There is simply no excuse. Adobe said that with the CC cloud stuff and monthly payments would come regular, ongoing improvements. Not true. A clear lie.


----------



## Mikehit (Jul 17, 2017)

LDS said:


> AvTvM said:
> 
> 
> > My explanation: Adobe has been resting on their laurels
> ...



Is it the same approach? AvTvM is talking (as he usually does) about how the company is complacent and does not need to develop, so it just sits there raking the money in. You seem to be talking about the same thing I was: how Adobe (and Canon) choose to apply their R&D budget - and because it does not chime with how AvTvM wants them to spend it, he takes this as laziness.


----------



## LDS (Jul 17, 2017)

Mikehit said:


> is it the same approach? AvTvM is talking (as he usually does) about how the company is complacent and does not need to develop so just sits there raking the money in. You seem to be talking about the same thing I was on how Adobe (and Canon) choose to apply their R&D budget - and because it does not chime with how AvTvM want them to spend it he takes this as laziness.



It is also true that commercial entities like easy money - not pleasing users if they don't see much more profit in it - and since the mantra "maximizing shareholder value" was hammered into managers' heads at their MBAs, they need really good reasons to increase R&D beyond what they believe is adequate to maintain market share when they're already the leaders.

And since the "shareholders" who matter are often the CEOs and other board members, they will happily funnel money into dividends and buybacks instead of R&D. Some understand how to balance that well enough; others don't, and their company may go down the sink.


----------



## EdelweissPirate (Jul 17, 2017)

AvTvM said:


> @edelweiss: i thought i had made it clear that i dont like [Lightroom] using a grossöy underperforming implementation of a database.



Yeah...this is why I suspect you don't understand databases in general and SQLite in particular. SQLite is a stripped-down, speedy database. I understand that you think a database is unnecessary, but this is one of the fastest Adobe could possibly have used. SQLite is open-source; Adobe didn't implement it—they just used it. There's a big difference. 

I'm not trying to make fun of you for not knowing these things. It's just that in order for your point to be valid, it requires a falsehood (that SQLite is an inherently slow database) to be true.

But let's put that aside for now. What operation do you think Lightroom is doing via its database that would be so much faster without one? I'm not asking what you find slow about Lightroom, though there are many aspects that could be faster. I'm asking what low-level operation you imagine is slowed down by Lightroom's use of SQLite.

I agree with LDS that the fact that the UI is written largely in Lua might have something to do with it. In most cases, Lua code is run in a user-transparent virtual machine, and that may be responsible for a lot of the UI lag. But I don't know enough about how Lightroom uses Lua to say for sure...Adobe could be using some fancy Lua-to-compiled-code maneuver (not standard compiled Lua bytecode) that's beyond my ken. 


AvTvM said:

> i think it is proven

I don't think that word means what you think it means.



> beyond a reasonable doubt, that LR performance problems - even on powerful hardware - are rooted in its database, since other raw converters/image editors - anything from canon DPP to Capture 1 pro to Adobe Bridge - does NOT have those performance problems.



This is a straightforward post hoc fallacy. Again, I'm not mocking you here; I'm just trying to point out that Lightroom's speed problem (which is absolutely real) isn't intrinsic to its use of a database.

AvTvM said:

> so for me the preferred solution would be a well performing "LR lite" with identical raw converter and image editor functionality but without database.
Then why not just use ACR plus Photoshop?


AvTvM said:

> i have no problem to find image files in windows thanks to my well thought out and disciplined file and folder naming scheme and structure.

Whether you understand it or not, your system is a database (albeit a comparatively slow one requiring a meatspace interpreter).

AvTvM said:

> furthermore there are apps available to conduct AI-based content specific search for images. no manual image tagging needed. no lightroom database needed.

As others have pointed out, all of these apps have their own databases. You've taken the database out of Lightroom, but you've still got the database in your workflow.


----------



## Valvebounce (Jul 18, 2017)

Hi EdelweissPirate. 
Funniest thing I have read today. ;D ;D ;D 

Cheers, Graham. 

> Whether you understand it or not, your system is a database (albeit a comparatively slow one requiring a meatspace interpreter).


----------



## haversian (Jul 18, 2017)

[I switched to computer engineering about 3/4 of the way through my CS degree, so while I have an adequate grasp of algorithmic complexity, my professional experience is all in hardware design, not software.]

Lightroom's scaling suggests to me that Adobe did the 'hard' work of making a multi-threaded RAW processor, but not the comparatively 'easy' work of spawning multiple instances of it to achieve near-linear speed-up on embarrassingly parallel tasks. Granted, there would be some overhead for inter-process communication, and they might have to serialize database access, but the heavy lifting is all in the image processing. And they don't appear to have done any work to predict users' future actions and prepare for them in advance.

Export to Disk is the only test that Puget did which exhibits reasonably linear scaling, and even then only out to 8-10 cores. I would have liked to see Puget re-run their test with two simultaneous exports of 40 images each, rather than one export of 80. Does that improve the scaling, perhaps by forcing LR to spawn a second worker thread? I haven't been able to get good data running such tests myself, but I only have 4 physical (+4 logical) cores to play with, so the fact that LR export scales reasonably linearly to that point would tend to obscure any advantage I might see from the hypothetical second worker thread.

Convert to DNG should be just as embarrassingly parallel as Export, but its scaling behavior is quite different. Speed-up from the second core is roughly linear, but the third and fourth cores do very little, and beyond that there's negligible additional scaling.

Conceptually, Convert is the same render-to-bitmap operation as Export, followed by encode-to-DNG rather than encode-to-JPG. However, as we can see from the fact that Convert is ~3x as fast on 1-2 cores as Export, the algorithm appears to be avoiding a lot of the work of manipulating image data that Export does.

Since the disk bandwidth is quite low (tens of MB/sec maximum) in both cases, either LR is phenomenally inefficient at file access (this seems very unlikely, since there's no obvious reason they wouldn't load an entire file into RAM and then flush it to disk when done), or the bottleneck is elsewhere. On a system with 20MB of cache, nearly enough to hold an entire RAW image from Puget's test, it seems unlikely that memory bandwidth is the limiting factor.

Generate 1:1 Previews and Generate Smart Previews both have similar scaling limits to Convert: peak performance is reached at about 4 cores, with best performance being only 2-3x as high as single-core performance.

There aren't any obvious resource limitations standing in the way of better performance: the tasks are not CPU bound, not disk-I/O bound, probably not memory-bandwidth or -latency bound, and don't have clear inter-process communication or coordination limits. All of which leads me to believe that LR was architected to minimize latency (it tries to process a single image as fast as it can, via a multithreaded rendering process) rather than maximize throughput (operations per hour). Though it's pure speculation on my part, I would guess that rendering is pipelined (e.g. demosaic -> user edits -> noise reduction -> sharpen - and yes, I realize that's not the order LR does those steps in; this is an example of pipelining) and that scaling is limited by the slowest pipeline stage and the total number of pipeline stages.

(It will be interesting when AMD's Threadripper CPUs get into users' hands to see how LR scaling changes in response to the different hardware behavior (such as markedly lower single-threaded memory bandwidth, but much higher multithreaded bandwidth; or the very different cache organization and performance characteristics) that the new platform brings.)

But despite this hypothetical focus on latency minimization, there seems to have been little to no effort at latency hiding. For example, when I'm viewing a photo in the library module, LR doesn't seem to pre-render the next and previous photos so I can 'instantly' move forward or back in the film strip. It doesn't pre-load everything it would need if I were to flip to the develop module to make some quick edits. All three of those operations are high-probability guesses as to what my next action will be, and are good targets for trading power efficiency for higher productivity.

I would like to see Lightroom make use of idle compute power so that repetitive, predictable operations are fast because LR has already anticipated and completed the thing I'm about to ask it to do.
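The "spawn multiple worker instances" idea can be sketched with a process pool; develop() below is a stand-in for a real raw-to-JPEG render, not any actual Lightroom API:

```python
from multiprocessing import Pool
import os

# Each export is independent, so a process pool should scale nearly
# linearly until disk or memory bandwidth intervenes.
def develop(raw_path):
    # stand-in for demosaic -> edits -> encode
    return raw_path.replace(".CR2", ".jpg")

if __name__ == "__main__":
    raws = [f"IMG_{i:04d}.CR2" for i in range(8)]
    with Pool(processes=os.cpu_count()) as pool:
        jpegs = pool.map(develop, raws)
    print(jpegs[0])  # -> IMG_0000.jpg
```

In a real exporter the workers would contend for disk and the catalog, which is exactly where the serialization overhead mentioned above would show up.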


----------



## LDS (Jul 18, 2017)

haversian said:


> On a system with 20MB of cache, nearly able to cache an entire RAW image from Puget's test, it seems unlikely



What 20MB cache are you referring to? The CPU one?

AFAIK, when a RAW is loaded into memory it gets far more RAM than its disk size. The Adobe document I posted above says that a 40Mpx image can take 240MB of RAM, and between 0.5 and 1GB with edits applied. Even halved for a 20Mpx image, they are still far more than 20MB.

Anyway, the CPU L3 cache is shared across all cores and across all the applications running on the PC (including the OS itself and the many background services) - Lightroom has very little control over what is in the cache at any given time; all it can do is optimize the code for cache access (I'll avoid going too technical in this forum) and hope.

Actually, more parallel execution could mean more CPU cache contention - depending on what the processes are doing.

Increasing the Camera Raw cache - disk-based, but under full LR control - could improve performance, because LR can skip some processing stages if the image is cached.


----------



## EdelweissPirate (Jul 18, 2017)

Cheers, Graham...I'm glad you liked the turn of phrase. 

LDS, you're right that Canon raw files are compressed. I have a 7D (MK I) and its ~18 MP images should decompress from raw to about 31.3 MB in memory:

5184 * 3456 pixels * 14 bits per pixel = 250822656 Bits ~ 31.3 megabytes

The 5D MK IV has these specs:

6720 * 4480 pixels * 14 BPP = 421478400 bits ~ 52.7 MB

Many editing operations are straight-up transforms of the raw-image matrix, so you could cache the results of each editing step with an uncompressed image roughly the same size as the original raw file. If I'm right about that, then to a first approximation, a 5D MK IV raw image with about 15 cached, rendered edits should take about 1 GB of RAM to store when using the editing module. 

That jibes with the Adobe document you posted. CPUs don't have nearly that much cache on silicon, of course. But many serious LR users have 32-64 GB of RAM, and in such cases LR would have enough resources to store quite a few pre-rendered images in RAM. 

Haversian, thanks for the informed commentary. I think you're right that LR does zero (or nearly zero) speculative rendering to hide latency. It seems like such low-hanging fruit for the developers that I can't help wondering whether it was a design decision to prevent LR from seeming to "churn" in the background while seemingly doing nothing. Being more aggressive about speculative execution would certainly make better use of 6+ physical CPUs than LR does now.
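That speculative-rendering idea is cheap to sketch. Assuming a hypothetical render() stand-in (not a Lightroom API), a viewer could warm the neighbors of the current photo in the background:

```python
from concurrent.futures import ThreadPoolExecutor

# Toy speculative pre-render cache: while the user looks at photo i,
# warm the cache for i-1 and i+1 so the next arrow-key press is instant.
def render(i):
    # stand-in for a real raw development
    return f"preview-{i}"

class PreviewCache:
    def __init__(self, n_photos, workers=2):
        self.n = n_photos
        self.pool = ThreadPoolExecutor(max_workers=workers)
        self.futures = {}

    def _ensure(self, i):
        if 0 <= i < self.n and i not in self.futures:
            self.futures[i] = self.pool.submit(render, i)

    def view(self, i):
        self._ensure(i)        # current photo
        self._ensure(i - 1)    # speculate: previous
        self._ensure(i + 1)    # speculate: next
        return self.futures[i].result()

cache = PreviewCache(100)
print(cache.view(10))   # -> preview-10
print(cache.view(11))   # already warmed in the background
```

A real implementation would evict old entries and cancel stale renders, but even this much would hide a lot of the latency discussed above.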


----------



## Khalai (Jul 18, 2017)

LDS said:


> Increasing the Camera Raw cache - disk-based, but under full LR control - could improve performance, because LR can skip some processing stages if the image is cached.



I have allocated 30 GB of my Samsung 950 Pro NVMe PCIe drive for caching, and I have 32 GB of RAM as well. It doesn't really help much...


----------



## LDS (Jul 18, 2017)

EdelweissPirate said:


> LDS, you're right that Canon raw files are compressed. I have a 7D (MK I) and its ~18 MP images should decompress from raw to about 31.3 MB in memory:
> 
> 5184 * 3456 pixels * 14 bits per pixel = 250822656 Bits ~ 31.3 megabytes



You forgot the color planes. After demosaicing you get an image with more than the 14 bits per pixel of the raw sensor data.

Let's remember LR uses a slightly modified version of the ProPhoto RGB color space, which needs more than the 24 bits per pixel of sRGB (values would be rounded to 32 bits anyway because of the way CPUs work; addressing half-byte quantities would make computations much more complex).

To preserve each color's range, LR will need at least 16 bits per color per pixel, which means 48 bits per pixel - and I guess that will be rounded to 64, depending on how LR encodes the data in memory; again, CPU instructions are usually optimized to work on 32/64/128-bit values.

So your 7D image would be 68MB at 32 bits per pixel, and 136MB at 64. That becomes 114/229MB for the 5D4, which is in line with what the Adobe presentation says and makes me think LR uses 64 bits per pixel.
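That arithmetic, spelled out (sizes in MiB, truncated the way the figures above are):

```python
# Working-set size of a demosaiced image at a given packed bits-per-pixel.
def working_set_mib(width, height, bits_per_pixel):
    return width * height * bits_per_pixel / 8 / 2**20

for name, w, h in [("7D", 5184, 3456), ("5D IV", 6720, 4480)]:
    print(name, int(working_set_mib(w, h, 32)), int(working_set_mib(w, h, 64)))
# -> 7D 68 136
# -> 5D IV 114 229
```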

That also raises the question of how good LR is at exploiting advanced CPU instructions (the various versions of MMX, SSE and AVX) designed to improve the performance of exactly this kind of processing (even without using the GPU) - although, again, it has to cope with the fact that not all supported CPUs have the latest ones.

The fact that the system requirements (https://helpx.adobe.com/lightroom/system-requirements.html) don't specify much about the processor makes me think LR only targets the least common denominator.


----------



## EdelweissPirate (Jul 18, 2017)

Thanks, LDS, for expanding on my post and filling in the stuff I missed. I'm interested in these things, but my field is mechanical engineering, not computer science or software engineering. 

A quick Google search implies that Lightroom is only compiled against SSE2 and nothing later. On the other hand, I think the real question is: what instruction sets is ACR compiled against? I'd expect it's the same, but maybe others here have better information.

Thanks again for correcting my post.


----------



## LDS (Jul 18, 2017)

EdelweissPirate said:


> A quick Google search implies that Lightroom is only compiled against SSE2 and nothing later. On the other hand, I think the real question is: what instruction sets is ACR compiled against? I'd expect it's the same, but maybe others here have better information.



AFAIK LR and ACR share the same code for the features they have in common. On top of that, LR uses the Lua engine, and it would be interesting to know whether, and how well, that can take advantage of more powerful instructions when available.

It looks like some other Adobe products can use more advanced instruction sets, but those are the ones aimed at more professional users, where there's less concern about raising the hardware requirements.

IMHO LR has room for big improvements, but the price may be dropping support for some older processors, and it may require some deep code changes.


----------



## haversian (Jul 19, 2017)

LDS said:


> What 20MB cache are referring to? The CPU one?



Yes. But you're right, LR almost certainly uncompresses the image to 12-24 bytes per pixel (4-8 bytes per color channel) in RAM before doing any work on it, so my reference to the RAW size was at best misleading. An uncompressed image would get you a larger working set, but more regular memory accesses since it's a simple 2D array, which memory prefetchers are well optimized to handle. Either way, that's slower than my RAW size reference would imply.

And that's a good comment about cache contention. I'm in the wrong field to be able to usefully speculate about whether LR's image processing algorithms look more like a streaming application, or more like something that would provoke fights over the cache. Surely Adobe has thoroughly profiled the code, but of course they're not going to share with us their results. I'll have to do some digging to see if I can find anyone else who has done that research.


----------



## LDS (Jul 19, 2017)

In this thread:

https://feedback.photoshop.com/photoshop_family/topics/lightroom-clone-and-brush-tool-can-not-stress-the-cpu-is-slow-only-on-cpu-with-xeon-architectures-can-confirm?topic-reply-list%5Bsettings%5D%5Bfilter_by%5D=all&topic-reply-list%5Bsettings%5D%5Bpage%5D=2#topic-reply-list

Simon Chen (one of the key people in LR development) offers a tweak to change how LR uses CPUs, and explains the design trade-off Adobe made in developing the import stage.


----------



## Diko (Jul 20, 2017)

EdelweissPirate said:


> The association between Lightroom's database and its performance issues exists only in your imagination. It's a mistake to claim that SQLite is an _a priori_ cause of Lightroom's performance issues.


Do me a favour and try putting 20 healing spots on an image. Then do the same for the next 10 images out of 50. Let me know if it gets sluggish when you move on to the next image. Also browse back and forth through the images.

And if you ask me what that has to do with SQLite, I think our debate would end right there ;-)

Now, having read the posts carefully, I tend to believe that most people here don't realize there IS a scenario in which the performance issues are not related to each other: not caused by one and the same bottleneck.

Also, the more knowledgeable IT guys who have posted seem to overlook the possibility that all of these contribute:

- SQLite
- Lua (I forgot to rant about it in my previous posts)
- Cocoa and Silver

Each of the above has its own merits and drawbacks. If I recall correctly, the memory-leak issues began after the transition to Lua, which was made to offer a better API for users to create custom plugins (only a presumption) and also, as they stated in the presentation mentioned earlier, to make calling core API functions easier. But as noted, it is an interpreted language: it needs a virtual machine to run, and is more a scripting language than a compiled one.

Each one of us has experienced one issue or another. Adobe should aim to update its core engine to use something more advanced than SSE2 (2001), such as AVX2 (2013). That was four years ago; if you edit photos on a computer older than that, you either need an upgrade or don't really need professional software this demanding.


And, all that being said, it is not true that ONLY SSE2 is being used. Check the same link with Simon Chen.

Camera Raw SIMD optimization: SSE2, *AVX*, *AVX2*. But this is ONLY one of the tools in LR!

I find this little config.lua tweak a great start in troubleshooting the performance issues.


Thank you Adobe!


----------



## TomDibble (Jul 22, 2017)

Diko said:


> EdelweissPirate said:
> 
> 
> > The association between Lightroom's database and its performance issues exists only in your imagination. It's a mistake to claim that SQLite is an _a priori_ cause of Lightroom's performance issues.
> ...



Umm, okay. Would the debate end because you don't understand what would be in the database in such a case?

This is exactly the sort of thing that is *not* exercising the database (unless Adobe's engineers are grossly incompetent, but if you believe that then why are you even giving their products a second glance?)

There are several databases in Lightroom:
[list type=decimal]
[*]The catalog, which stores references to all the images on disk, as well as a cache of the modification instructions for those images.
[*]The "Previews" (1:1 and Standard) cache, which caches rendered 1:1 and "standard" size renders of the final images
[*]The "Smart Previews" cache, which stores a compressed copy of the original raw data so that basic edits can be made in the UI before the original RAW file is pulled up from disk, or when the original is missing entirely
[/list]

There might be more in some circumstances, but those are the biggies. Also note that almost everything above is categorized as a "cache": the actual source of record for that information is elsewhere - for 1:1 previews, that is the original RAW file plus the list of instructions applied to it as described in the sidecar file. These databases exist precisely because *they fixed performance problems in the non-DB-based early versions of Lightroom*.

I'm assuming you are trying to say that the first database (the catalog) is the issue here, as it is what contains the stack of changes per image pulled from disk (where those healing-brush sources and destinations are, and the geometry of the spots). That is a fairly small amount of data to store (the biggest piece is the mask of the healing image, which should be an 8-bit-per-pixel RLE-compressed bitmap unless, again, Adobe's engineers are wholly incompetent), and it is stored alongside the rest of the Develop instruction stack for each image.

Let's look at what happens when you go from image to image in the Develop module:

[list type=decimal]
[*]The Smart Preview is pulled up, if in cache, which is by all accounts instantaneous
[*]The RAW file is pulled into memory if available, replacing the Smart Preview. This may take some time as it is a disk access.
[*]The list of changes is pulled out of the Catalog database
[*]Each change is applied to the image in turn, using the rendering engine appropriate for the change (e.g., rotate and crop, spot removal, exposure, USM for sharpening, etc.).
[*]The rendered image is displayed on screen
[/list]

If the third step above were a problem, it would be a problem no matter what is in that list. For instance, rotate all your images slightly, adjust the exposure, add a contrast curve, etc. From a database perspective, 99% of the cost of step 3 is the lookup - finding the row in the database - while the rest is pulling the data out. I'm not sure about the specific table structure in Lightroom's database, but assuming the developers know what they are doing, there are only a few possibilities that make sense. Most likely the list of changes to apply is kept in a child table related to the parent "Photo" table.
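
To make the guess above concrete, here is a minimal sketch of such a parent/child layout. This is a hypothetical schema, not Lightroom's actual catalog layout (which, as noted, isn't public); the point is that fetching an image's change list is one indexed query regardless of what kinds of edits are stored:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE photo (id INTEGER PRIMARY KEY, path TEXT);
    CREATE TABLE develop_step (
        id INTEGER PRIMARY KEY,
        photo_id INTEGER REFERENCES photo(id),
        seq INTEGER,    -- position in the edit stack
        params TEXT     -- serialized settings for this step
    );
    CREATE INDEX idx_step_photo ON develop_step(photo_id, seq);
""")
con.execute("INSERT INTO photo VALUES (1, 'IMG_0001.CR2')")
con.executemany(
    "INSERT INTO develop_step (photo_id, seq, params) VALUES (1, ?, ?)",
    [(1, "crop 3:2"), (2, "exposure +0.3"), (3, "heal spot #1")],
)

# Step 3 of the walkthrough: pull one image's change list with a single
# indexed lookup -- cheap whether the steps are crops or healing spots.
steps = [row[0] for row in con.execute(
    "SELECT params FROM develop_step WHERE photo_id = ? ORDER BY seq", (1,)
)]
print(steps)
```

The cost of this query depends on the number of steps, not on the image's pixel count, which is why the database is an unlikely culprit for megapixel-correlated slowdowns.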

But, we don't see such a drag just by doing "any" set of changes to the images. We need to do computationally-intense (relatively) changes to result in a measurable slowdown.

This is exactly what we would expect if #4 above is the bottleneck, because that is the first point in the whole process where an image that has ten crop/rotates and ten exposure adjustments looks different from one which has twenty healing brush adjustments.



> Now I read carefully the posts and tend to believe that most people here don't realize that there IS a scenario in which all performance issues are not related. Not caused by one and the same bottleneck.
> 
> Also all more knowledgeable IT guys that have posted seem to omit the possibility that everything is the cause.
> 
> ...



I've been involved with enough refactors and language rewrites to know that the language chosen can make a big difference in performance, but "interpreted" languages are not necessarily worse for a task. If you are dealing with evolving algorithms, in fact, moving to a higher-level, "less efficient" language will often allow for significant performance boosts at the algorithm level which would have been impractical in the "more efficient" language. We have a school scheduling engine which I've rewritten a few times now; as one example, moving from C++ to Java with the original algorithm intact cost us about 20% performance (this was back in the Java 1.4 days, without the great JIT compilers we enjoy now), but allowed us to put a much more elegant algorithm in place which yielded gains at the 10,000% level and in some cases better (a run that was on track to take 15 years in the old codebase completed in 200 milliseconds).

Now, I can't speak to Adobe's use of Lua. But Lua can be run in a JIT-compiling VM that goes all the way to machine code, although the quality of Lua JITs may not be anywhere near those found in .NET or Java VMs. I also don't know how "deep" Adobe's use of Lua is - whether it is just at the UI and API levels or actually runs any of the image-manipulation algorithms. I would not expect it to be used for the latter, generally.

As for any of these being the primary bottleneck in the "moving from image to image" performance problem, I'd say Lua is much more likely to be the problem than SQLite. Not sure what Cocoa has to do with it (this is a problem on Windows too, right?) or what you mean by "Silver" (Silverlight? Wouldn't Adobe more likely have Flash in there, if anything?)



> Each one of us has experienced one issue or another. Adobe should aim to update its core engine to use something more advanced than SSE2 (2001), such as AVX2 (2013). That was four years ago; if you edit photos on a computer older than that, you either need an upgrade or don't really need professional software this demanding.



Agreed that using SSE2 only (and none of the newer, expanded instruction sets) would, if true, definitely be a performance problem, and it would show up exactly where we see it: applying adjustments is just plain slow. Adobe likely uses a modular architecture that would allow them to compile key image-processing modules at several levels (SSE2 at one end, the latest at the other, perhaps nothing-newer-than-five-years in the middle) and pull in the dylib/DLL appropriate to the host machine's architecture at runtime. This is well-trodden territory, and would not require Adobe to write off even customers on the oldest hardware (although at the cost of potentially coding these low-level optimizations three times instead of once). But, as you point out later, it seems "SSE2 only" is another one of those myths: a tidy little story people tell themselves to explain why Lightroom's performance sucks in several situations.
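
The runtime-dispatch pattern described above boils down to probing the CPU's feature flags once at startup and loading the best build of the hot kernels it supports. A sketch of the selection logic; the flag strings and kernel file names here are illustrative stand-ins, not anything Adobe ships:

```python
# Pick the most advanced kernel build the host CPU can run. A real
# implementation would populate cpu_flags from CPUID; here the caller
# supplies them.

def pick_kernel(cpu_flags):
    """Return the (hypothetical) kernel module for the best supported ISA."""
    # Ordered best-first; SSE2 is the baseline fallback.
    for isa in ("avx2", "avx", "sse2"):
        if isa in cpu_flags:
            return f"develop_kernels_{isa}.dll"
    raise RuntimeError("CPU below minimum requirements")

print(pick_kernel({"sse2", "avx", "avx2"}))  # modern CPU
print(pick_kernel({"sse2"}))                 # 2001-era baseline
```

The design cost is exactly the one noted above: each kernel gets compiled (and tested) once per ISA level, but no customer is written off.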

IMHO, more likely the issue is not with the database, or with the use of Lua, or even with the raw performance of the image manipulation steps (because I suspect they are tightly tuned to do what they do without having quality issues). More likely the root issue is that Lightroom constantly *throws its work away* instead of storing it or caching it. At the same time, it sits nearly 100% idle much of the time, wasting potential CPU cycles.

There is absolutely NO reason why if I apply a bunch of changes to one image in Develop (which it renders on screen), click on another image in the filmstrip, then click back, that there should be ANY delay in showing me the fully rendered image with all its changes. *It just did the full render*. But, it threw everything away when I went to look at another image in the filmstrip for reference.

There is also absolutely NO reason why, if I am flipping through images in the Library or Develop module and pause for a second on one image, I shouldn't be able to flip to the next two or three without any rendering delay. Instead, while I am looking at image 27/300, Lightroom is twiddling its thumbs and dreaming of unicorns. It should be anticipating my next move: the guy just went from image 1 to 2, then 2 to 3, and so on up to 27; he is likely going to want 28, 29, and 30 next.
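
That look-ahead behaviour can be sketched in a few lines. `render` here is a stand-in for the real pipeline, and a real implementation would run the prefetch on a background thread rather than inline:

```python
from collections import OrderedDict

class PreviewCache:
    """LRU cache of finished renders that warms the next few frames."""

    def __init__(self, render, capacity=16, lookahead=3):
        self.render = render          # expensive full render (stand-in)
        self.cache = OrderedDict()    # insertion order doubles as LRU order
        self.capacity = capacity
        self.lookahead = lookahead

    def _get(self, idx):
        if idx not in self.cache:
            self.cache[idx] = self.render(idx)
            while len(self.cache) > self.capacity:
                self.cache.popitem(last=False)  # evict least recently used
        self.cache.move_to_end(idx)
        return self.cache[idx]

    def show(self, idx):
        img = self._get(idx)
        # "Idle" time: warm the frames the user will most likely want next.
        for ahead in range(idx + 1, idx + 1 + self.lookahead):
            self._get(ahead)
        return img

cache = PreviewCache(render=lambda i: f"rendered #{i}")
cache.show(27)
print(sorted(cache.cache))  # 27 plus the prefetched 28, 29, 30
```

The same structure also fixes the "click away and back" complaint above: returning to a recently shown image is a cache hit, not a full re-render.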

The problem with the "the issue is the database" myth is that making either of the above *real, actual* improvements to Lightroom performance means putting *more* cached processed data into a cache database (which, ultimately, is pretty much what all of the Lightroom databases are, other than the catalog information itself). And to do that, Adobe needs to improve its underlying database management: there is no reason why Lightroom can't automatically detect changes on disk without us having to tell it to rebuild the cache via "Synchronize Folder", and there is no reason why we shouldn't have significantly better control over how long 1:1 previews and the like hang around in the caches, even to the point of being able to remove a particular image from them. The issue isn't "too much database"; it is "too much calculated on-the-fly, too little database".
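
The "detect changes without Synchronize Folder" part is, at its core, just comparing what's on disk against what the catalog last recorded. A minimal polling sketch (a production version would use OS file-change notifications such as FSEvents or ReadDirectoryChangesW rather than scanning):

```python
import os

def find_stale(folder, recorded_mtimes):
    """Return files that changed or appeared since the catalog last looked.

    recorded_mtimes maps absolute paths to the modification time the
    catalog stored for them; a mismatch (or missing entry) means the
    cached previews for that file need invalidating.
    """
    stale = []
    for name in os.listdir(folder):
        path = os.path.join(folder, name)
        if not os.path.isfile(path):
            continue
        if recorded_mtimes.get(path) != os.path.getmtime(path):
            stale.append(path)   # re-render previews / update catalog row
    return stale
```

Run periodically (or on folder-open), this turns "Synchronize Folder" from a manual chore into background bookkeeping.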


----------



## LDS (Jul 23, 2017)

"Silver" is the name given by Adobe to the LR user interface, which is not the OS native one, and aims to be the same on both Windows and macOS. There's a JIT for Lua, but does LR use it?
Pre-rendering other images may have drawbacks as well. It takes resources, which then become unavailable to the foreground task, and it's wasted work if the user doesn't work sequentially - I often move much more 'randomly' after the initial culling.
Probably LR should offer the user options to select what behavior they prefer, depending on how they use LR and how powerful their computer is. It makes the application more complex, but a single compromise may not fit everybody.


----------



## TomDibble (Jul 23, 2017)

Agreed; options are good, and not everyone's workflow is the same. LR should be able to learn from what you do, though, and work ahead accordingly. Simple predictive workflows are not that hard to implement, especially given Lightroom's modal flow (one set of heuristics for the Library module, another for Develop, etc.), and done right, the only options needed would be how much background processing time it may take, how quickly work gets cancelled when something else needs the CPU, and so on.

I never knew that LR's UI was called "Silver". I'm not a fan of Adobe's need to build a nonstandard UI for all of their products, but on my list of gripes against LR it rates fairly low. Still, the non-standard UI might well be a performance issue (much better to leave that kind of thing to the OS developers!)


----------



## LDS (Jul 23, 2017)

TomDibble said:


> Still, the non-standard UI might well be a performance issue (much better to leave that kind of thing to the OS developers!)



Again, Adobe needs to support both Windows and macOS, and ensure LR works exactly the same on both, including plug-ins. Usually, it's much easier to make an application faster when there are no cross-platform requirements and layers.

Also think what would happen to all those people writing books, publishing books and offering courses if LR had two different UIs, one for macOS and one for Windows... 

Moreover, all the cross-platform UIs I've seen so far are no better than Silver. I'll take a look at how Affinity handled this issue.


----------



## Diko (Jul 29, 2017)

TomDibble said:


> Never knew that LR's UI was called "Silver"...


 Guess the name of the custom-tailored color space ;-)

*Melissa RGB*: _It is using ProPhoto RGB chromaticities, but with a gamma of 1.0 instead of 1.8. Meanwhile, the Lightroom viewing space uses the same ProPhoto RGB chromaticities but with an sRGB tone response curve. Melissa Gaul, who was the QE manager for Lightroom, suggested this space should be called Melissa RGB since all RGB spaces to date have been named after men!_

I find it to be a smart move.


----------



## Diko (Jul 29, 2017)

TomDibble said:


> Umm, okay. Would the debate end because you don't understand what would be in the database in such a case?
> ....
> Let's look at what happens when you go from image to image in the Develop module:
> ..
> ...


 
Really?!? After all that has been said, you still omit how *local* changes are addressed?!? Crop, curves and all the rest are significantly smaller records compared to local changes. In my case I have seen this behaviour at *10, 30, and 50 megapixels*: the sluggishness correlates with pixel count. I don't even need to mention 100 MP, because there I don't even try to use local adjustments, for exactly that reason.

Even if the addressing is relative rather than absolute, the number of local-adjustment records still depends on the number of corrections, whether spot heals or brush strokes. And you really haven't tried what I asked you to, have you ;-)

As for the rest, we seem to hold the same or similar opinions. My take, as I mentioned, consists to the biggest extent of presumptions based on observation and knowledge.

As for Lua - I have no idea. In my experience interpreters tend to slow things down... but I am not as proficient in the field as you seem to be.

However my main point is still valid:

1/ Different people complain about different issues. E.g. I have never experienced a GPU-caused slowdown, compared to the sufferings of so many others. Ergo it can't be *ONLY ONE bottleneck* or bug. Additionally, people are not using the same hardware, nor the same software (OS included), and most importantly not the same workflows. I don't like stains, pimples, too-dark circles under the eyes, etc., ergo heavy use of local adjustments and the healing spot, whereas I rarely deviate from the automatic exposure (applied on import), with tiny crop corrections now and then.

IMHO all of that makes it hard to solve all the mysteries surrounding the epic slowness of LR. It is simply different issues for different people.



TomDibble said:


> The problem with the "the issue is the database" myth is that to do either of the above *real, actual* improvements to Lightroom performance, means putting *more* cached processed data into a cache database (which, ultimately, is pretty much what all of the Lightroom databases are, other than the specific catalog information).



Some versions ago they had a bug with the cache... an overflow? or a leak... (I don't know how to put it in English). One of LR's caches was growing way too big, cluttering users' hard drive space... and then they simply "fixed" it. I guess they tried to overcome the slowness by using caches, but somehow it didn't work right for everyone. And... yeah... someone noticed the non-stop disk writes, worrying they would wear out their SSDs.

In the end, I also tend to agree about behavioural-pattern recognition for predictive use of the system... but I am also afraid of the kind of thing Microsoft tried to pull off for so many years... It may work out, but at the cost of our suffering for at least two major LR releases.


----------



## LDS (Jul 29, 2017)

Diko said:


> Really?!? After all that has been said, you still omit how *local* changes are addressed?!? Crop, curves and all the rest are significantly smaller records compared to local changes. In my case I have seen this behaviour at *10, 30, and 50 megapixels*: the sluggishness correlates with pixel count. I don't even need to mention 100 MP, because there I don't even try to use local adjustments, for exactly that reason.
> 
> Even if the addressing is relative rather than absolute, the number of local-adjustment records still depends on the number of corrections, whether spot heals or brush strokes. And you really haven't tried what I asked you to, have you ;-)



Look at the sidecar XMP files. They store the same information the database does. It's no surprise that more megapixels mean slower performance: bigger images are simply more computationally intensive. The same data is stored in the database or XMP file either way, but more processing is needed to apply the same recipe to a bigger image.

For example, here is the data for a heal brush stroke:

```
<crs:RetouchAreas>
 <rdf:Seq>
  <rdf:li>
   <rdf:Description
    crs:SpotType="heal"
    crs:SourceState="sourceAutoComputed"
    crs:Method="gaussian"
    crs:SourceX="0.646875"
    crs:OffsetY="0.421875"
    crs:Opacity="1.000000"
    crs:Feather="0.000000"
    crs:Seed="+2">
    <crs:Masks>
     <rdf:Seq>
      <rdf:li
       crs:What="Mask/Ellipse"
       crs:MaskValue="1.000000"
       crs:X="0.642014"
       crs:Y="0.405729"
       crs:SizeX="0.005382"
       crs:SizeY="0.005382"
       crs:Alpha="0.000000"
       crs:CenterValue="1.000000"
       crs:PerimeterValue="0.000000"/>
     </rdf:Seq>
    </crs:Masks>
   </rdf:Description>
  </rdf:li>
 </rdf:Seq>
</crs:RetouchAreas>
```

The data doesn't depend on the image pixel count, nor does it store pixel data - it just stores the parameters required to recompute the stroke. These are the same data used by ACR - the engine is the same. It's just that most people using ACR will probably make local changes in Photoshop instead, which works differently.
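
Since the stored coordinates are normalized to the 0..1 range, the same record drives a render at any resolution; only the pixel work scales. A tiny illustration using the crs:X / crs:Y values from the snippet above (the image sizes are arbitrary examples):

```python
# Map a normalized spot position onto concrete image sizes: the metadata
# stays identical, only the pixel coordinates (and the pixels to process)
# grow with resolution.

def to_pixels(norm_x, norm_y, width, height):
    """Map normalized spot coordinates onto a concrete image size."""
    return round(norm_x * width), round(norm_y * height)

spot = (0.642014, 0.405729)  # crs:X / crs:Y from the XMP above
for w, h in ((6000, 4000), (11648, 8736)):   # ~24 MP vs ~100 MP
    print(f"{w}x{h}: spot at {to_pixels(*spot, w, h)}")
```

This is why megapixel-correlated sluggishness points at the rendering pipeline, not at the size of the stored recipe.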

LR doesn't apply the changes in the order you applied them - the processing pipeline knows the best order to apply them in, so one change may require recomputing not only itself but also subsequent changes that depend on it.

What LR really needs is to improve the performance of its processing pipeline - reading the "recipe" from an XMP file or the SQLite database is probably not the issue.


----------



## justsomedude (Jul 30, 2017)

Post Deleted.


----------



## justsomedude (Jul 30, 2017)

LDS said:


> Are you sure you can freely talk about it, and you didn't sign an NDA? Check, and if so, delete your post.



There was no NDA. However, there was an "Agree to the Terms and Conditions" checkbox, which I did not read. Just to be safe, I'll delete my post.


----------



## LDS (Jul 30, 2017)

justsomedude said:


> There was no NDA. However, there was an "Agree to the Terms and Conditions" checkbox, which I did not read. Just to be safe, I'll delete my post.



Remove any remaining references too... I removed my answer to you for the same reason.


----------



## canonlover (Jul 30, 2017)

I'm waiting on LR7 before I upgrade. I'm on 4. I have no interest in getting that cloud thing, so if we get speed improvements I'm all game.


----------



## Click (Sep 10, 2017)

I only wish that LR was still available as a standalone version...


----------



## privatebydesign (Sep 11, 2017)

Click said:


> I only wish that LR was still available as a standalone version...



It is.

https://www.bhphotovideo.com/bnh/controller/home?A=details&O=&Q=&ap=y&c3api=1876%2C%7Bcreative%7D%2C%7Bkeyword%7D&gclid=EAIaIQobChMI4s_7rqmc1gIVSiSGCh1AngwBEAQYASABEgJyf_D_BwE&is=REG&m=Y&sku=1140015


----------



## Click (Sep 11, 2017)

PBD you're the best! 

Thank you, Sir!


----------

