Can you share the editable file of this design? I might want the circles to be green, if that looks good with the design. See also Project Browser: Create logo for Tamper.
Can you share the editable file of this logo? I would like the image a bit bigger and I would like the circle to be green, as I think the color green communicates that the object is now "good", as in you managed to transform the data the way you wanted. I'm also still considering my own design, but I don't know if that fits with the proposed logo for Feeds Tamper: Create logo for Feeds Tamper for Project Browser Initiative.
The tests look like they need work. I do indeed get a random test failure.
I see that RedirectTest is failing. That's unrelated to this issue; it's caused by the signature of createUser() having changed in Drupal. It now expects a list of permissions as its first parameter.
In the MR I provided a fix and tests. Some code for the tests is copied from the Commerce Abandoned Carts tests. I did try to programmatically create an order by reusing code from RedirectTest, but I ran into the issue that the path /cart did not exist, probably related to the code that fakes a request.
I hope that the tests pass. I just noticed that there might be a random test failure happening.
megachriz created an issue.
Oops, uploaded the one that wasn't compressed.
I updated the logo for Feeds Extensible Parsers, based on the new Feeds logo.
I did have the idea to include something from the following too, but I wasn't sure yet how to incorporate them nicely.
XPath:
QueryPath:
megachriz created an issue.
I'm still using the patch from #79. I tried to add test coverage for it some time ago, but I haven't found a way yet to write a test that fails without the fix. The test might have failed locally at the time.
The change is failing tests.
This is still an issue. You cannot rely on routing when requesting the available checkout steps outside the checkout process.
Use case: display the checkout steps on the /cart page. Multiple carts can exist on this page, so there is no order in the route.
@jacobupal
Well, there is more config in Drupal that you cannot import if other config does not exist. And you could still technically apply the workaround if you remove the dependency line from the config file.
Alright, thanks for reporting back! Closing this issue.
@jacobupal
Prevented. If a feed type had a dependency on the feeds_item field, you could not import the feed type configuration if the feeds_item field does not exist. So in that case you would also need to import the feeds_item field configuration.
@jacobupal
The feed type indeed cannot work correctly if you don't have the feeds_item field. So perhaps the bug here is that the feed type doesn't declare a dependency on the feeds_item field.
Can you provide an export of your feed type configuration and a sample source file?
Also, I think that this issue belongs to Feeds Extensible Parsers?
megachriz created an issue.
Thanks for reporting back! I'll close this issue then.
That's odd, I don't see getFromAddress() being declared twice in this file:
https://git.drupalcode.org/project/commerce_abandoned_carts/-/blob/2.1.x...
Also, when I download the 2.1.x-dev module manually and search for "getFromAddress", it only appears in the file AbandonedCartMail.php.
I also don't see protected $time; declared twice in AbandonedCartMail.php.
Do you perhaps have patches applied to the module? Or maybe you have two copies of this module on the system?
@batigolix and I looked at this issue yesterday at the DICTU Drupal Developers Day in Assen.
@batigolix successfully imported a media image on a node. There were some errors, however:
- When mapping to a media field first and then checking out the code from this issue, he got an error for getSummary().
- The first import test resulted in a SQL error related to language (language cannot be NULL). When changing the language on the target to "English", the import went fine. It could be that there's an issue when creating a Media entity when the language is not specified. I did see that on his configuration the language on the target was set to an empty string, so perhaps that is related to the error.
@batigolix is working on a test, but ran into the error "filename does not exist", which I did not have an explanation for right away.
@arccoss and I looked at this issue yesterday at the DICTU Drupal Developers Day in Assen.
@arccoss confirmed that MR !89 fixes the issue for images. We also found out why the test is currently failing: the call to Drupal\feeds\Feeds\Target\File::getMimeSubtypeFromUrl() doesn't work in a Kernel test. The method uses cURL, and since a Kernel test only has PHP and a database, but no site accessible via HTTP, trying to retrieve something from a URL like "http://localhost/modules/custom/feeds/tests/resources/assets/attersee_no..." results in a 404 HTML document. That is why the test said the extension is html: that is what getMimeSubtypeFromUrl() gets back from a 404.
A way to deal with this while keeping the Kernel test is to somehow override the cURL part of Drupal\feeds\Feeds\Target\File::getMimeSubtypeFromUrl() in the test. Here are some thoughts about what we could do:
- The method could be split up into two parts: one that does the cURL part and one that does the "finfo" part.
- In FileTest.php and in ImageTest.php we add classes that extend Drupal\feeds\Feeds\Target\File and Drupal\feeds\Feeds\Target\Image respectively, and change FileTest::getTargetPluginClass() and ImageTest::getTargetPluginClass() to return the newly created classes. In these classes a method would need to be overridden: getMimeSubtypeFromUrl() could be completely overridden, returning just an extension string (see the sketch after this list).
- Or only the cURL part would get overridden, but that method override must be capable of returning something similar to what cURL can do, and I don't know enough about cURL to know if that's doable.
- Another thought: instead of using cURL directly here, replace the logic with a service. I'm not sure if the same result could be achieved using Guzzle?
- I believe that there's also a possibility that cURL is not available, and perhaps the code should account for that situation as well. At least in the D7 version of Feeds there was a check for this in the function http_request_use_curl(). Maybe we can avoid doing a check like this if we can replace the usage of cURL with a service, as said in the point above.
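For illustration, here is a minimal sketch of the test-only override idea from the second point above. The class name, the namespace and the exact signature of getMimeSubtypeFromUrl() are assumptions for the example and would need to be checked against the real File target class.

namespace Drupal\Tests\feeds\Kernel\Feeds\Target;

use Drupal\feeds\Feeds\Target\File;

/**
 * Test-only file target that avoids the cURL-based extension detection.
 *
 * Sketch only: the method name comes from the discussion above, but its
 * visibility and parameters are assumptions.
 */
class TestFileTarget extends File {

  /**
   * Overrides the cURL-based detection, which cannot work in a Kernel test.
   */
  protected function getMimeSubtypeFromUrl($url) {
    // Return a fixed extension so the rest of the target logic can be tested
    // without a site that is reachable over HTTP.
    return 'jpeg';
  }

}

FileTest::getTargetPluginClass() would then return TestFileTarget::class, and ImageTest would get a similar class extending Image.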
I also wondered why there is another MR here. It seems like another approach, but there's no clear explanation of why it exists.
@imash Can you explain why you opened another MR? Did the other MR not work, or does your MR solve something that is not covered by the other one?
I see that the latest patch includes other changes that were made after rc2 (not related to this issue), like coding standard fixes. So it makes sense that the latest patch won't apply on rc3.
I think that you need to update the Coder module, because the coding standard rules changed recently. The changes in the MR now introduce coding standard issues instead of fixing them.
See also the following commits:
https://git.drupalcode.org/project/floating_block/-/commit/e63b808c79208... (floating_block_test.css)
https://git.drupalcode.org/project/floating_block/-/commit/d10c07dacdef5... (HelperTest.php)
@rob230
A new release has been created!
https://www.drupal.org/project/feeds/releases/8.x-3.0-rc3
@rob230
I wasn't aware that this issue caused a WSOD, I thought it generated only some errors on the feed page.
Thanks for creating the patch, I can see that applying the diff could be troublesome because of the binary file. I hadn't tried applying the diff with Composer, I only tried it like this:
curl https://git.drupalcode.org/project/feeds/-/merge_requests/199.diff | git apply
curl https://git.drupalcode.org/project/feeds/-/merge_requests/205.diff | git apply
I didn't consider that it could cause trouble for Composer.
While I think this issue doesn't affect everyone who updates Feeds (it only affects sites that have unfinished imports at the time of updating), good call on creating a new release. I can do that this Thursday.
@rob230
On top of 8.x-3.0-rc2:
- https://git.drupalcode.org/project/feeds/-/merge_requests/199.diff from "Error: Serialization of 'CurlHandle' is not allowed in serialize()"
- https://git.drupalcode.org/project/feeds/-/merge_requests/205.diff from this issue
And then you should be good.
I see that on a site where I had this issue I'm still using the solution in #7.
I do plan to try to update that site to as many D11-compatible releases as possible, so perhaps I can test the changes from this issue along with it. But no promises: I noticed that when I tried to run composer update just now, quite a few patches could no longer be applied. Whether there will be any time left to look at this issue depends on how much time it takes to update those.
Yes, this is possible for all config.
Example for overriding configuration for Commerce Abandoned Carts in settings.php:
$config['commerce_abandoned_carts.settings']['testmode'] = TRUE;
$config['commerce_abandoned_carts.settings']['testmode_email'] = 'foo@example.com';
Thanks for merging!
Yes, I've seen the Drush issue. I have worked on Drush commands in the past, but I'm not up to date yet with the latest API and best practices. I could try to write tests for the commands, however, if I can find enough time for it.
On the Commerce Abandoned Carts settings you can configure a test mail address. Or you can disable test mode. Doing one of these should resolve this issue.
If it doesn't resolve your issue, feel free to reopen this issue.
Because tests are failing, I added a task to the list that says "Try to adjust the code or the tests so that they pass.".
I added a list of tasks to the issue summary that could help with fixing this issue.
I added a list of tasks to the issue summary that could help with fixing this issue.
Updated the remaining tasks.
Updated issue summary.
Alright, good idea to have this issue separate for now :)
@rudi teschner
Cool, thanks for testing. I merged the code!
It sounds like this could be related to or a duplicate of "Add support for remote file systems for http downloaded data".
The duplicate entry issue (for the feeds_clean_list table) could be a completely different issue, however.
It's interesting that a MacBook going into sleep mode could cause the import to hang. For imports in the UI I think that makes sense (if the webserver is running on the MacBook), but I would expect that with imports on cron, the import would eventually continue.
I have a Mac too. It would be interesting to run a large import and then, at the exact moment that cron is running, put the Mac into sleep mode, and then see if that makes the import hang.
I've updated the issue summary and added a list of tasks that remain to be done.
If you can figure out why it hangs and then resolve that issue, you could start the import from scratch. But if it hangs once and you don't get that issue resolved, it will likely hang again on a restart of the import. So in that situation I would choose to remove the items that were already imported from the CSV file.
The fix provided in the MR should fix this issue.
It would be great if you can test it, I plan to merge it this Thursday.
- I don't have experience importing a CSV file with that many lines, but I would choose to import in background and let the import be done in chunks using cron.
- I don't know. There could be a bug (either in Feeds or another module) causing the import to stop. If this is the case, you should be able to find something about it in the server logs (the error may not be logged by Drupal). Another possibility is that the server thinks "this process has been running for way too long, I'm going to stop it". Or a lack of memory. These are probably reported in the server logs too.
- If you unlock the feed and then restart the import, the import will start from the beginning. If you have configured a CSV column as unique, Feeds would skip items it already has imported, but it will still go through each item in the file in order to check that. Say the import hangs at 2000 items, and you restart the import, then for the first 2000 items of the CSV file Feeds will check if they are already imported and would see that this is the case (and not import them again). But just doing these checks can also take a long time.
- Yes, setting cron to run once an hour would work, but I would try to run cron more often. If, for example, Feeds manages to import 500 items per cron run, it would take 648 hours to import all 324000 items. That's almost a month. If you want to stop the cron import, you would unlock the feed.
- An import running in the UI (where you see a progress bar) stops shortly after you close the browser (it would just finish only the last chunk it was busy with). An import running on cron does not depend on the browser. You can restart the import by unlocking the feed and then start the import again.
- Yes, an import running on cron does not depend on the browser.
Since I don't know what makes the import hang, I cannot guarantee that the import of all 324000 items will be successful when imported during cron. If it happens to hang, I would first check the server logs to see if there's any information what made it hang. Then I would check how many items were already imported and remove that many items from the CSV file and try again.
You could see that the import hangs if the amount of imported items stays the same after a cron run.
By installing the Queue UI module, you can inspect/monitor the import tasks that are scheduled to run. If the same task is retried over and over again, then the import hangs. (One day I hope to add functionality to Feeds that detects that the same task is retried over and over again, so that it can warn the user that something went wrong during the import, or maybe even skip the task so it can continue doing the rest of the import.)
@anybody
Thanks for your compliments!
I'm pleased as well that I could get this one finished. On to the next thing!
The "Delete" button on a feed, deletes the feed itself. Not the import items.
Ah, I see you just updated the issue summary. Feel free to add/update the documentation. :)
- Active checkbox: you can configure feed types to import sources regularly. This is called "Periodic import" (see image). Only feeds that are active will be used for periodic import. So when deactivating it, this particular feed will no longer be imported regularly, only when you click "Import" or "Import in background".
- When an import for a feed starts, the feed gets locked. This is to prevent running another import for it. If two imports for the same feed would run at the same time, that could cause issues; for example, it could cause earlier imported items to be deleted that should not be deleted (if you have configured previously imported items that are no longer in the feed to be deleted). You would unlock the feed if you believe the import got stuck. You can then restart the import. When unlocking, Feeds cleans up the metadata for the import that did not finish.
- If you want to start the import over, first unlock the feed. Then you can delete all imported items.
- "Delete items" only deletes items that are created or updated with this feed. It doesn't delete items from other feeds - if those items are not created or updated with this feed. It is technically possible to configure feed types so that two feeds update the same content. Say feed 1 creates items A, B and C and feed 2 creates items D and E and updates item C. Then "Delete items" on feed 1 would delete items A, B and C and "Delete items" on feed 2 would delete items C, D and E. So in this case they would both delete item C, because they both "touched" item C.
A tip for importing large files: I think it is a better idea to import these in the background (by using the "Import in background" button). That way, the import runs in chunks during cron runs, and the chance that the import will hang is smaller, because it doesn't depend on the browser being kept open. It can still hang or get stuck, however. For example, when a fatal PHP error occurs in the process, or when the server shuts down. Or perhaps when running module updates (because that could cause module files to be temporarily removed, which could possibly cause fatal PHP errors too).
Import in background does require cron to be configured. Per cron run, the import process runs for about a minute. So I can imagine 324000 lines would take quite a large number of cron runs too.
I hope this answers your questions. Feel free to reopen this issue if you have more questions. :)
Merged!
I merged the changes. Thanks all!
Looks good! Merged.
I don't understand your issue. I tried to reproduce it in the following way:
- Created a feed type with the HTTP Fetcher (download from url) and the CSV parser.
- Mapped to "title" to node title and "body" to the body field.
- Created a feed, I noticed that delimiter was set to
,
(comma), changed it to;
(semicolon) - Imported the following source:
title;body Foo;Bar
And the content imported fine.
So perhaps there's a specific plugin or module that disrupts the process?
Perhaps it is useful to use the Affinity Designer file from #3459384-43: Project Browser: Create a logo for Feeds or the SVG from #3459384-40: Project Browser: Create a logo for Feeds.
I would like the strokes of the bear to be as thick as on the Feeds logo:
This is the logo we added to the project. The quality was set to 90 and the file size is now about 20 KB.
The following command was used to downsize the PNG from the afdesign export.
pngquant Feeds.png --force --quality 90
I provided a possible fix in the MR. I did not test it yet though. I'm still working on the automated tests.
If you want to apply the code to Feeds 8.x-3.0-rc2, you also need to apply the code from "Error: Serialization of 'CurlHandle' is not allowed in serialize()".
Because a related issue has now been opened whose fix would overlap with this one, I merged the code. Feel free to reopen this issue if it didn't completely fix the problem for you.
The related issue: Error: Call to undefined method Drupal\feeds\State::messenger()
Thanks, I hope I can review this one soon.
@bobknocker
It's a pity that you ran into this, but I don't know why it happens. If you are on Drupal Slack, maybe you could ask in the #composer channel?
Tagging with "multilanguage"
What kind of syntax is this? Anyway, I don't know if there already is a module that can parse this format.
Since the format looks similar to XML, if you want to write a parser yourself, I recommend creating a FeedsParser plugin with a class that extends \Drupal\feeds_ex\Feeds\Parser\ParserBase. That class is part of Feeds Extensible Parsers, so that module would become your module's dependency.
Marking this as "Fixed" because I gave an answer. Feel free to reopen if you need more information or help.
I've opened "Display CSV headers in textarea when creating a new feed", so this feature can be tested in combination with the Feeds Textarea Fetcher module.
I think that I completed all of the listed remaining tasks, but to be sure it would be good to check that.
megachriz created an issue.
In September, @irinaz said in Slack that she wanted to help on the logo:
https://drupal.slack.com/archives/C34CECZAL/p1725818501416959?thread_ts=...
So I'm assigning the issue to her for now.
We'd like to go for something like the following logo:
The editable file as SVG is in the zip that was posted in #40:
https://www.drupal.org/files/issues/2024-08-22/Feeds-edited.svg_.zip
But here is the Affinity Designer document as well:
https://www.drupal.org/files/issues/2024-11-14/Feeds-edited.afdesign.zip
Remaining task: check whether the document has the right proportions, and perhaps whether the strokes are thick enough. So some checks and maybe small tweaks.
smustgrave credited megachriz.
I created "Add a setting for limiting the amount of times a lock may be extended".
If someone wants to take a first step implementing a bit of it, that would be appreciated.
megachriz created an issue.
I think it should default to the smallest effect on users. Thus the best would probably be to use a big time frame, like 12 hours. There will be few users who have a feed running longer than that. I would assume that those users would rather check the settings description carefully and change the settings to their needs.
Alright, that sounds reasonable. The issue reported in the issue summary looks to be more about Feeds never stopping retrying the fetch when it encounters a 404. While that is a possible cause for a feed getting stuck (and as a result locks keep getting extended), maybe we should handle implementing lock settings in a separate issue?
You don't have patches applied to Feeds, by any chance?
Thanks for reporting. I think that the error is related to incomplete imports at the time of the update.
The only weird thing is that it says "undefined method". I would have expected the error to be something like "cannot call addMessage() on null" or something similar.
@thirstysix
Can you check if you see something suspicious in the error_log on your server?
"KeywordFilter: Move config pre-processing from form validation to tamper execution" is done! So here is an MR that makes KeywordFilter throw a SkipTamperItemException instead of emptying a value if the data does not contain one of the configured keywords.
Hopefully, the MR passes the tests.
Thanks for following up so quickly! I merged the changes.
Thanks for giving some background as well. At first I wondered whether it would be useful to add token support to Tamper plugins, but since the Tamper module would be missing context (it doesn't get entities passed, for example), maybe token replacement is better handled externally. I think updating/altering the config used at runtime is good enough, as long as you don't store that updated config. But I assume that you have handled that correctly.
I see that in "Properly handle arrays in tamper plugin config" there is a special case for the "find_replace_regex" Tamper plugin. I wonder if it would be a good idea to use a different data type in the config schema for tamper.find_replace_regex.find, but on https://www.drupal.org/docs/drupal-apis/configuration-api/configuration-... I didn't see a logical subtype of string to use. We could define our own subtype and call it "regex", for example (see the sketch below). This is unrelated to this issue, but a thing that crossed my mind.
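To illustrate the idea (purely hypothetical, including where the schema would live), such a subtype could be declared once in Tamper's config schema and then referenced for settings that hold a regular expression:

# Hypothetical addition to config/schema/tamper.schema.yml:
regex:
  type: string
  label: 'Regular expression'

tamper.find_replace_regex.find could then use type: regex instead of type: string. Again, just a thought, not something this issue needs to solve.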
Here is my proposal for the field descriptions:
Retroactive update
Move and rename previously uploaded files. After saving the field settings, the paths of all previously uploaded files will be updated immediately. This will only occur once. So after the operation is done, this option will be disabled again.
Warning: This feature should only be used on developmental servers or with extreme caution.
Active updating
Actively move and rename previously uploaded files as required. If necessary, the paths of previously uploaded files are updated each time the entity they belong to gets saved.
Warning: This feature should only be used on developmental servers or with extreme caution.
megachriz made their first commit to this issue's fork.
Yes, the build form already converts an array of words to a string that is usable for the textarea. I put this back to "Needs review", so you can give it another test. Thanks in advance.
Perhaps good to know: in "Make Keyword Filter to skip an item instead of emptying the value" I plan to make another change to the Keyword Filter, namely that it will throw a SkipTamperItemException instead of emptying a value. This is consistent with how the Required plugin works, which can also throw that type of exception, so I expect it would not cause new issues.
But I plan to make that change after this issue is done, because otherwise it would cause merge conflicts.
My reasoning for no longer using the "words" setting:
- I think that it is better to save a list of words only once, instead of having it in two formats.
- The tamper() method did not use the "words" setting (now it could use it, but only for backwards compatibility and only if the new setting "words_list" is empty).
- The tamper() method wants to have an array of words.
Converting the list back to a string isn't difficult either. buildConfigurationForm() easily converts an array of words to something usable for the form:
'#default_value' => implode("\n", $this->getWordList()),
So upon submitting the form, the words entered in the textarea are converted to an array. And when loading the form, the array is converted back to a string.
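The submit side is then essentially the reverse. Something along these lines (a sketch only; the form key 'words_list' is an assumption, not necessarily what the MR uses):

// Split the textarea input on newlines, trim each line and drop empty lines.
$input = $form_state->getValue('words_list');
$words = array_values(array_filter(array_map('trim', explode("\n", $input))));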
@jurgenhaas
Does this cause any issues for you? Would you still need a processConfigValues() method?
@jurgenhaas
Note that the "words" setting (where words are saved as string) is only respected for backwards compatibility. When (re)saving the configuration, the setting will become an empty string.
Example
Before (configuration in the old format, from a feed type):
words: |-
  foo
  bar
word_boundaries: true
exact: false
case_sensitive: false
invert: false
word_list:
- /\bfoo\b/ui
- /\bbar\b/ui
regex: true
function: matchRegex
uuid: 81082ad2-4c8a-44c8-aeff-2969eb3f9612
plugin: keyword_filter
source: qux
weight: 0
label: 'Keyword filter'
After (configuration in the new format, from a feed type):
words: ''
words_list:
- foo
- bar
word_boundaries: true
exact: false
case_sensitive: false
invert: false
uuid: 81082ad2-4c8a-44c8-aeff-2969eb3f9612
plugin: keyword_filter
source: qux
weight: 0
label: 'Keyword filter'
I also fixed some phpcs and cspell issues.
megachriz created an issue.
I'm testing this myself. One thing that I think is not correct: if you have permission to import your own feeds, but not feeds you do not own, you can still access the CSV template of other feeds. Also, I think it makes sense that you should be able to access the template if you can create or update feeds, even if you cannot import them.
What I'd like to change as well is the path for the feed-specific template. Currently this is /feed/template/{feeds_feed_type}/{feeds_feed}, but I think it makes more sense if it were /feed/{feeds_feed}/template instead. This also requires additional access control and tests.
The feed type specific template can stay at /feed/template/{feeds_feed_type}.
So, work to be done!
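Just to sketch the direction (everything here is hypothetical: the route name, the controller and the access operation are placeholders, not what the MR contains), the new path could look roughly like this in feeds.routing.yml:

# Hypothetical routing sketch; names are placeholders.
entity.feeds_feed.template:
  path: '/feed/{feeds_feed}/template'
  defaults:
    # Placeholder controller name.
    _controller: '\Drupal\feeds\Controller\TemplateController::page'
  requirements:
    # A "template" operation would still need to be handled in the feed access
    # handler, covering the create/update/import cases described above.
    _entity_access: 'feeds_feed.template'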
I see: group relationships should get cleaned up upon group removal. Perhaps it is caused by a local issue then. Or maybe I accidentally patched the wrong file? Because I see that my MR is different from the patch in #3.
Good call on requesting the steps to reproduce the issue first.
@anybody
Thank you for testing and for your compliments!
Now that I'm mostly done porting my projects to Drupal 11 (I decided to take a break on that for the remaining ones) and all Feeds stable release blockers have been resolved, I can finally pick up some issues that I had put on the wait list. So I sort of randomly picked this one up, as it looked like it could be resolved in a relatively small number of work hours.
Tests are passing!
One thing I'd like to see added is a method for returning the raw template contents as a string. This can then be used by Feeds Textarea Fetcher. With that module you can put the source data in a textarea. It would be nice if, in the case of the CSV parser, it already displayed the CSV columns. And that is only possible if Feeds Textarea Fetcher can load the raw template contents.
For now I set this to "needs review" because it is ready to be tested.
I found this example for combining values using XPath: https://scrapfly.io/blog/how-to-join-values-in-xpath/
So it looks like the XPath that you need is:
concat(bc:location/bc:street, " ", bc:location/bc:number)
Upon further looking at the stack trace, the array_flip() error did not happen in \Drupal\feeds\Entity\FeedType::__sleep(), but in \Drupal\Core\Entity\EntityStorageBase::loadMultiple() instead:
/**
* {@inheritdoc}
*/
public function loadMultiple(?array $ids = NULL) {
$entities = [];
$preloaded_entities = [];
// Create a new variable which is either a prepared version of the $ids
// array for later comparison with the entity cache, or FALSE if no $ids
// were passed. The $ids array is reduced as items are loaded from cache,
// and we need to know if it is empty for this reason to avoid querying the
// database when all requested entities are loaded from cache.
$flipped_ids = $ids ? array_flip($ids) : FALSE;
The error looks to be triggered by either an entity reference field or a file field. While theoretically this could come from a bug in Feeds, it is not clear whether this is the case. Since this issue is over 3 years old now, I think I'm just going to close it. Feel free to reopen if you still encounter this issue and can provide the steps to reproduce it.
Note: I did merge a change for FeedType::__sleep(), because the double usage of array_flip() was not needed.
I plan to work on this and have drafted something locally, but first "KeywordFilter: Move config pre-processing from form validation to tamper execution" needs to land.
I've merged "Keyword filter: variable executed as function, security issue?", which should fix part of this issue.
Here I went further by removing all config processing from validateConfigurationForm(). Now that method only validates. There is still a bit of config processing left in submitConfigurationForm(): the list of words as entered on the form is converted to an array. For example, "Foo\nBar\nQux" is saved as ['Foo', 'Bar', 'Qux'].
Summary of changes:
- Deprecated 'words' setting;
- Removed 'word_list' setting;
- Added 'words_list' setting as a replacement for 'words' and 'word_list';
- Determine regex to use when applying the Tamper plugin and not when saving the config;
- Added backwards compatibility for the 'words' setting.
I drafted a change record that also includes the changes from "Keyword filter: variable executed as function, security issue?":
https://www.drupal.org/node/3485191
@jurgenhaas
Would this work for you? Or would you still need something like processConfigValues(), basically just for converting "Foo\nBar\nQux" to ['Foo', 'Bar', 'Qux']?
Merged the changes!
I know that multilingual support in Feeds is not perfect yet, but since I don't have clients with multilingual sites currently, I am prioritizing other Feeds issues for now. Unfortunately, I don't know a good workaround for this issue. Perhaps making 'Authored by' translatable or using a separate feed type for this field? I would need to dive deeper into this topic to think of better workarounds.
Note: the provided link results in "Preview Not Found". Was it only available for a very short amount of time?
Tagging the issue with "multilanguage", so I can find this issue again when I plan to focus on Feeds multilingual issues.
This could be a Composer issue. I don't know a lot about fixing Composer-related problems. One thing that has sometimes helped me when I faced a Composer-related issue like this is deleting the vendor library ("laminas" in this case) and the module directory ("feeds" in this case), and then trying again with either composer install or composer update drupal/feeds.
Note that on the automated test runner, Feeds installs fine with laminas/laminas-feed (2.23.0), so that's why I think it is more likely to be a Composer issue than a Feeds issue. See https://git.drupalcode.org/project/feeds/-/jobs/3183360
To be certain: "laminas/laminas-feed" is installed in the site's vendor directory? Thus in site/vendor, not in site/web/modules/contrib/feeds/vendor?
I'm not sure if I want to require Drupal 10.2 yet, so it might be useful to add test coverage for this.