Even after placing the Gutenberg main module in the custom module folder (and removing the customization in the contrib folder), I still have a broken UI.
keithlee_giai → created an issue.
Well... I have changed the MinIO setting to virtual-host style so that it honors Amazon S3's default URL structure.
I thought it should work, so I activated S3FS again on my test site, a copy of the live website.
Then, once I activated S3FS, all media files turned blank with the aforementioned URL structure, like mydomain.com/s3/files/....
Hoping that a simpler configuration might rewrite the URLs, I tried another batch run of 'Copy Local Files to S3'.
Unfortunately, it did not work. I tried both the GUI and the command line.
As before, most image files are uploaded to the desired S3 bucket, but images under the /styles/ folder are only partly uploaded and their URLs are broken.
I feel like there is a chunk of cache left in the DB that gets picked up when I activate S3FS again.
Can you please help me purge that cache and restart the bulk image upload?
At the very least, when I re-activate S3FS, the images in /contents/media should still be visible.
If it is not the existing DB cache, I am really puzzled. With virtual-host addressing set up, it behaves the same as Amazon S3, except that the endpoint URL is not Amazon's native one.
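For context, here is a generic sketch (not S3FS code; the bucket and host names are made up) of how the two S3 addressing styles produce different URLs for the same object, which is why a client pinned to one style breaks when the endpoint only honors the other:

```python
def s3_object_url(bucket: str, key: str, endpoint: str, virtual_host: bool) -> str:
    """Build an S3 object URL in either addressing style (illustrative only)."""
    if virtual_host:
        # Virtual-hosted style: the bucket becomes part of the hostname.
        return f"https://{bucket}.{endpoint}/{key}"
    # Path style: the bucket is the first path segment.
    return f"https://{endpoint}/{bucket}/{key}"

# Hypothetical names, for illustration only:
print(s3_object_url("media", "styles/thumbnail/a.jpg", "minio.example.com", True))
# -> https://media.minio.example.com/styles/thumbnail/a.jpg
print(s3_object_url("media", "styles/thumbnail/a.jpg", "minio.example.com", False))
# -> https://minio.example.com/media/styles/thumbnail/a.jpg
```

A server configured for only one style will return errors (or wrong objects) for requests built the other way, which looks exactly like "the same bucket works everywhere except this one client".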
FYI, with the virtual host, all the other open-source applications I am working with create zero issues. Apart from the fact that I have a custom endpoint, Nextcloud, Discourse, Moodle, and WordPress work smoothly.
@cmlara
It seems like an AWS SDK / MinIO issue, or to some extent my misconfiguration.
- https://meta.discourse.org/t/s3-cdn-url-w-bucket-name-minio/338667
The link above is my experience integrating MinIO with Discourse, where I also mention probably the same issue with Nextcloud.
Your comment that the current thumbnail image URL falls back to a generic S3 URL when S3FS cannot find the relevant information helped me see what's going on.
For Drupal, it is basically the same as Nextcloud and Discourse: I have to use the internal IP/port for the API and the CDN with the full bucket name as the matching URL.
For now, I will settle for the workaround, but when the AWS SDK is updated or MinIO fixes the issue, I might have to reconfigure the whole setup.
Anyway, thank you for the support, @cmlara.
For anyone running a MinIO multi-server setup, I hope my struggle helps you figure out the issue.
@cmlara, thank you for the update.
I have not seen any error message displayed on the screen. Perhaps it is because I have not enabled the 'verbose' option in settings.php.
I will check again with another test site later on.
When it comes to debugging: since it did not work through the website, I tried the command-line option with Drush, but it failed in exactly the same way. I have not seen any error log during the image migration; the only thing I saw during the command-line run was the completion percentage growing from 0 to 100%.
The command-line option did not honor the Domain module's domain source, so all images got uploaded to the main site's S3 bucket, but the number of uploaded image files matches between the command-line and web-UI cases.
I did have a timeout once, but that was 1 out of 10 attempts, and it happened only when I tried it through the website.
I have not checked with ::class() and ::dirScan(), as I was not aware of such options. I will also check them on the test site with 'verbose' on.
For now, I am more concerned about an incompatibility, as with Nextcloud and MinIO. Can you please help me understand how S3FS establishes the connection between Drupal and S3? For Nextcloud, files were uploaded when S3 was connected as an external drive but not as the main drive, and those two rely on different ways of integrating S3. For S3FS, since most files are uploaded properly except the thumbnails, I suspect a configuration mismatch with the integration method.
@ressa, as much as I respect your suggestion and this whole Drupal dev community, after seeing what this module does to the test site, I was really afraid of losing money on development again. I have lost $$$ on development across a variety of platforms and projects, including in-house services. Given that the WP migration module has been maintained for 10+ years in the Drupal community, while WordPress's DB handling has hardly changed at all, if it still has this many bugs in basic functions, it was really hard for me to trust any development on this module.
Sorry to be really harsh in commenting like this, but as a 'wannabe' dev-friendly person, in that I frequently sponsor open-source module customizations and sometimes fix minor bugs and version-compatibility issues myself, after finding out that 'public://' was the very cause of my lost hours, it was hard to convince myself to ask for further development. It might not make sense from a dev's point of view, but for a project manager with limited successful project experience, your current code and your code-upgrade history are the only two indicators of a dev's capability.
Besides, from a dev's point of view, every additional function may be seen as an incremental improvement, but for the business people who pay for it, it is mostly 0 or 1. A partial solution still requires human backup, which means hiring the relevant people, training them with the relevant information, and monitoring the outcome. Unless it is a complete solution, it really isn't much of a 'solution'.
I will probably never need this module again, but for the other people suffering from WP's limited capability, I do hope they benefit from a massive upgrade of this migrator.
Instead of node_xxxx data tables, the module gave me the option to name the node data tables myself.
And featured images and custom posts each had their own node type, so after migrating 3 custom post types from WordPress, I had 6 new node-like data tables with a slightly different structure from Drupal's native node tables.
Because of the difference, many Drupal modules were incompatible.
Besides, the module did not migrate other key values of WP posts, such as category, SEO summary, keywords, etc.
In the end, I paid a few thousand bucks to have each post copied and pasted by hand.
I appreciate the volunteer group's hard work on this module, but as a long-time WP user, I have almost given up migrating to Drupal, largely because of this module's limited capability. I had no alternative due to my company's situation, but not many people will go for a human migrator and pay $$$.
If Drupal wants to grow its user base, I think this is the very first module that has to be massively upgraded.
Bulk updating: in the 'Actions' tab, there is an option called 'Copy Local Files to S3'.
I set it to 'Always' and clicked 'Copy local public files to S3'. It uploaded all my existing files.
Based on your comment, it is because the thumbnail images are not uploaded to my S3.
Indeed, many of them are missing, but it is not 0%: about 20-30% of the thumbnail images are available in the S3 bucket.
I still cannot understand why most thumbnail images failed to upload.
Perhaps due to mis-configuration?
At least 10,000 out of 15,000 images were uploaded. Most failures come from the thumbnail image folders under styles/image_type/ (where image_type is the thumbnail style). Since most thumbnails are not uploaded, and all URLs look like mydrupalurl.com/s3/files/styles.... instead of using my S3 bucket name, all thumbnails in Contents -> Media are currently blank.
Given that the other images are uploaded properly, and all the images I use for ad links have proper S3 URLs, I doubt misconfiguration is the issue.
To avoid any URL issue, I also enabled 'Enable CNAME' and added cdn.example.com (identical to 'Use a Custom Host').
This might not be related to this case, but I am including it in case it helps.
I have had trouble integrating my MinIO S3 with Nextcloud when using the aforementioned CDN URL. I run a MinIO multi-server setup: each node has an internal IP as its endpoint, like 192.168.1.1:9000 and 192.168.1.2:9000, with the CDN name as the public endpoint on the proxy server. I usually don't have to use the internal IPs, but Nextcloud failed to connect through cdn.example.com; I was only able to get Nextcloud to use MinIO S3 with an internal IP. I guess this is because Nextcloud does not use the URL to find the S3 instance but goes through the API? I was lost after reading some articles, and I don't use Nextcloud anymore.
keithkhl → created an issue.
Just tested out the overnight dev version. I can confirm that a new OpenID user role is now mapped to the existing user account.
I think the issue can be closed, once the dev version rolls out.
Does this update help auto-map new OpenID users to existing users' roles?
The default choice options mysteriously appeared this morning. You can close this ticket.
Related to this request, I wonder if having more than 1 domain source will create any issue.
I thought the domain source is the hard URL, while domain access is a sort of alias.
So, as long as I keep a single source (and a single sitemap component), I won't be hit by Google's duplicate-content penalty.
I can see that I can have more than one default value for domain source. I was going to pick as many sources as possible, but then the duplicate-content penalty came to mind.
This might be a simple Googling- or ChatGPT-level question, but I would like a Drupal guru's opinion.
keithkhl → created an issue.
@s3b0un3t, I can make it work by temporarily commenting out __construct in the function, but indeed it happens to all modules relying on Drupal core's config form.
I can confirm that @johnjw59's patch works on 2.0.0-beta2 and the October dev version.
It would be ideal to have an option to choose which domain should be the default, though.
The Change Author Action module (https://www.drupal.org/project/change_author_action) worked for me.
keithkhl → created an issue.
I just found that in Layout Builder edit mode -> Manage attributes -> Block / Block title / Block content attributes, all fields (ID, Class, Style, Data-* attributes) are empty. Do I have to specify which values should be pulled?
I am new to Drupal, and really new to Layout Builder, so this is only a guess, but with other modules that deal with block layout, whatever settings I made in Drupal's native UI (e.g. Structure -> Content types -> Edit / Manage display) were not effective. I had to re-define every value in Layout Builder, especially the ones under 'Manage attributes'.
I just wonder if this might be related to Flippy not picking up any node. If so, can you please help me with what value I should use for Data-* attributes? (ID/Class/Style should follow typical HTML grammar, I assume.)
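In case it helps anyone reading along: data-* attributes are just free-form key/value pairs serialized into the tag. A minimal sketch of how they end up rendered (the attribute names below are made up for illustration and are not values any particular module expects):

```python
def render_data_attributes(data: dict) -> str:
    """Render a dict as HTML data-* attributes (illustrative only)."""
    return " ".join(f'data-{key}="{value}"' for key, value in data.items())

# Hypothetical key/value pairs, purely for illustration:
print(render_data_attributes({"history-node-id": "42", "block-mode": "sidebar"}))
# -> data-history-node-id="42" data-block-mode="sidebar"
```

So an empty field simply means no data-* attributes are emitted for that block; there is no required value unless some JavaScript on the page reads a specific key.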
keithkhl → created an issue.
Just double-checked: after removing the default domain, I can access the /admin/content page again. Now, the main page (www.example.com) is not accessible.
keithkhl → created an issue.
This happened to me as well after setting up a default domain in the Domain module. Without the default I don't get the error, but then the main site is not accessible. I wonder if there is any workaround, at least...
keithkhl → created an issue.
By default, all images were set to be converted to WebP for multiple sizes, but since the image was already WebP, the conversion process didn't work.
keithkhl → created an issue.
I thought I had installed both extensions on my Ubuntu system.
I just changed the image toolkit from GD to ImageMagick. Problem solved.
keithkhl → created an issue.
With a vanilla installation, I managed to add avif to the allowed image list in the above two structure settings, but I still cannot upload AVIF image files. When I upload *.avif to the node editor's image field, I get the following error:
- The specified file ****.avif could not be uploaded.
- The image file is invalid or the image type is not allowed. Allowed types: avif, avifs, bmp, dib, gif, jfif, jpe, jpeg, jpg, png, svg, svgz, tif, tiff, webp
It is surprising that even though I have the MIME type in the allowed list, I still end up with the same error.
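One thing worth checking is whether the server-side image library actually recognizes the file, independent of the extension allowlist: an image toolkit built without AVIF support can reject a file whose extension is allowed. As a rough illustration of container sniffing (this is not Drupal's actual upload-validation code), an AVIF file can be identified from the brand in its ISO-BMFF 'ftyp' box:

```python
def looks_like_avif(header: bytes) -> bool:
    """Heuristic check for the AVIF brand in an ISO-BMFF 'ftyp' box.

    An illustration of container sniffing, not Drupal's actual validation.
    """
    # Bytes 4-8 must be 'ftyp'; the major brand follows at bytes 8-12.
    return len(header) >= 12 and header[4:8] == b"ftyp" and header[8:12] in (b"avif", b"avis")

# A minimal hand-built 'ftyp' header (not a complete, decodable image):
sample = (24).to_bytes(4, "big") + b"ftyp" + b"avif" + b"\x00\x00\x00\x00" + b"mif1avif"
print(looks_like_avif(sample))                                # True
print(looks_like_avif(b"\x89PNG\r\n\x1a\n" + b"\x00" * 8))    # False
```

If the extension passes but the toolkit cannot decode the format, you get exactly this "file is invalid or type is not allowed" symptom even with the MIME type listed.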
@gord.fisch, that was the file path module.
It prepended 'public://' to image URLs, which nullified the URL rules that the WP migrator follows.
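To illustrate the kind of mismatch (a made-up sketch, not the migrator's actual code or rules): a rewrite rule that expects site-relative paths simply stops matching once a stream-wrapper scheme is prepended to the path:

```python
import re

# Hypothetical rewrite rule, purely for illustration.
WP_UPLOAD_RE = re.compile(r"^/wp-content/uploads/(.+)$")

def rewrite(path):
    """Map a WP upload path to a Drupal files path; None if the rule doesn't match."""
    m = WP_UPLOAD_RE.match(path)
    return f"/sites/default/files/{m.group(1)}" if m else None

print(rewrite("/wp-content/uploads/2024/01/a.jpg"))          # -> /sites/default/files/2024/01/a.jpg
print(rewrite("public://wp-content/uploads/2024/01/a.jpg"))  # -> None: the scheme prefix defeats the rule
```

The second call returns nothing because the pattern is anchored to a leading slash, which is the same failure mode as a migrator whose URL rules never see the prefixed paths.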
I was going to use this plugin, but not anymore. It creates DB tables that do not follow Drupal's native structure. I might have misconfigured it, but that is already a big no-go. Migrating WP custom post types to Drupal's new content types didn't work either. Image URLs were not properly changed, and some images missed Drupal's UUID creation.
I could debug and fix these one by one, but I would rather pay a few thousand bucks for human migration. That will be far less error-prone.
Sorry to bother you guys again about this fixed issue, but I wonder if a patch is available.
I am having the same error on D11 with Gutenberg 3.0.4 (and dev) when I upload images directly via the 'Upload' button.
If I do it through the media library pop-up screen, I do not have this issue.
It would be great to have the patch available before your next minor version rolls out, if that is going to take some time.
@gease, I fixed it. I don't know why, but the dependency on 'claro' did not load the theme. I added the full function into the code, so as long as Claro is not removed, the function should work. I've confirmed it on a number of admin themes (Gin Admin, Bootstrap admin, and D10's administration theme). I attached the 'dirty hack', in case anyone gets in trouble like me.
@gease, I've tried the function as a custom module on the Gin Admin and Bootstrap 5 admin themes, but it doesn't work on either. I've reverted to Claro for now.
To the team: as a long-time WordPress user heavily dependent on Gutenberg, I deeply appreciate the hard work on this. However, with the current setup, I have become wary that every single module could potentially lock me out of the node editor with Gutenberg.
Whenever a functional mod to the admin theme is implemented, like the experimental navigation bar module for example, I cannot use the Gutenberg editor anymore.
I can further customize the CSS of a Claro sub-theme for the UI, but I would much appreciate an official sub-module that overrules each theme's node-editor customization. At least that was how WordPress behaved by default.
Can you guys commit a WIP update, at least as a dev version? Including the above issue, I remember reading about a number of bugs being patched, but I am still stuck with a manually installed 3.0.4, which creates a version-incompatibility message every time I want to install Gutenberg sub-modules via Composer.
Create a new form designated for the sidebar, and in Manage form display, add everything you want under the sidebar form.
It still does not remove the revision box. I either have to remove the revision option in Edit, or use CSS to make the box invisible.
Now I need to find out how to create a sub-theme for Claro.
keithkhl → created an issue.
@gease, thanks a bunch. I hadn't thought about that. As long as Claro stays in the system, like with sub-themes.
It's because the example blocks module within the 3.0.4 package is not compatible with D11.
Same here. 3.0.4 is locked out of D11. Manual installation worked, though.
After setting a Bootstrap theme as the default, I cannot access the editor even with Claro as the admin theme (with the editor option checked).
In the dev console, I can see that the file requests are made and the statuses are all 200, but the screen body gets covered by the Bootstrap JS.
The entity-type-agnostic issue (https://www.drupal.org/project/gutenberg/issues/3445655) should take this into account.
It was because I had removed the SASS starter kit after creating my own custom sub-theme.
Once I installed the theme again, the issue was gone.
Where should I add that function? I guess I have to change the library 'claro/form-two-columns' to something else.
keithkhl → created an issue.
I went further and uninstalled most of the modules that could be uninstalled, and now it is working. I will add later which module or customization caused the issue.
I have uninstalled the following modules, which removed the above database connection error, but I still cannot run this migration. Despite the error message, the XML files are indeed uploaded, but it gets stuck there.
- Migrate Drupal
- Migrate Drupal UI
keithkhl → created an issue.
Thanks for the comment. Yup, when in Rome, do as the Romans do. I got that. I guess this type of slightly harsh comment is also the Drupal style. Besides, I should have raised it on the Gutenberg module board, not here. My bad.
Just so you know, there are over 5,000 Drupal websites running Gutenberg, based on the module page stats. I get that this has not been the Drupal style, but that number does not look negligible, considering all the other module stats I have seen over the past week.