Apr 24 14:23:29.277127 ip-10-0-128-36 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 14:23:29.775413 ip-10-0-128-36 kubenswrapper[2567]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 14:23:29.775413 ip-10-0-128-36 kubenswrapper[2567]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 14:23:29.775413 ip-10-0-128-36 kubenswrapper[2567]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 14:23:29.775413 ip-10-0-128-36 kubenswrapper[2567]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 14:23:29.775413 ip-10-0-128-36 kubenswrapper[2567]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 14:23:29.779127 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.778969 2567 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 14:23:29.786772 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786741 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 14:23:29.786772 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786766 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 14:23:29.786772 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786770 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 14:23:29.786772 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786775 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 14:23:29.786772 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786779 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 14:23:29.786998 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786783 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 14:23:29.786998 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786786 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 14:23:29.786998 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786789 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 14:23:29.786998 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786792 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 14:23:29.786998 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786795 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 14:23:29.786998 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786798 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 14:23:29.786998 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786800 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 14:23:29.786998 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786824 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 14:23:29.786998 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786829 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 14:23:29.786998 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786832 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 14:23:29.786998 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786835 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 14:23:29.786998 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786838 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 14:23:29.786998 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786840 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 14:23:29.786998 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786843 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 14:23:29.786998 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786852 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 14:23:29.786998 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786855 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 14:23:29.786998 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786857 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 14:23:29.786998 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786860 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 14:23:29.786998 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786862 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 14:23:29.786998 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786865 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 14:23:29.787449 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786867 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 14:23:29.787449 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786870 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 14:23:29.787449 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786872 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 14:23:29.787449 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786875 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 14:23:29.787449 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786877 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 14:23:29.787449 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786881 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 14:23:29.787449 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786883 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 14:23:29.787449 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786886 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 14:23:29.787449 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786889 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 14:23:29.787449 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786891 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 14:23:29.787449 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786893 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 14:23:29.787449 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786896 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 14:23:29.787449 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786898 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 14:23:29.787449 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786902 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 14:23:29.787449 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786904 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 14:23:29.787449 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786908 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 14:23:29.787449 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786910 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 14:23:29.787449 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786912 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 14:23:29.787449 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786915 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 14:23:29.787449 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786918 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 14:23:29.788004 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786926 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 14:23:29.788004 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786929 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 14:23:29.788004 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786932 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 14:23:29.788004 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786934 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 14:23:29.788004 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786937 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 14:23:29.788004 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786940 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 14:23:29.788004 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786943 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 14:23:29.788004 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786945 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 14:23:29.788004 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786947 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 14:23:29.788004 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786950 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 14:23:29.788004 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786952 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 14:23:29.788004 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786955 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 14:23:29.788004 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786957 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 14:23:29.788004 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786960 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 14:23:29.788004 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786962 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 14:23:29.788004 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786965 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 14:23:29.788004 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786967 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 14:23:29.788004 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786969 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 14:23:29.788004 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786972 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 14:23:29.788004 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786975 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 14:23:29.788466 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786977 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 14:23:29.788466 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786979 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 14:23:29.788466 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786984 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 14:23:29.788466 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786988 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 14:23:29.788466 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786990 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 14:23:29.788466 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786994 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 14:23:29.788466 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.786998 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 14:23:29.788466 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787001 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 14:23:29.788466 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787003 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 14:23:29.788466 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787006 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 14:23:29.788466 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787008 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 14:23:29.788466 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787011 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 14:23:29.788466 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787013 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 14:23:29.788466 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787021 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 14:23:29.788466 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787024 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 14:23:29.788466 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787026 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 14:23:29.788466 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787029 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 14:23:29.788466 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787031 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 14:23:29.788466 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787034 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 14:23:29.788919 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787037 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 14:23:29.788919 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787039 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 14:23:29.788919 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787472 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 14:23:29.788919 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787478 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 14:23:29.788919 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787481 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 14:23:29.788919 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787483 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 14:23:29.788919 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787486 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 14:23:29.788919 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787488 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 14:23:29.788919 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787491 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 14:23:29.788919 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787494 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 14:23:29.788919 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787496 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 14:23:29.788919 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787500 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 14:23:29.788919 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787502 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 14:23:29.788919 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787504 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 14:23:29.788919 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787507 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 14:23:29.788919 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787510 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 14:23:29.788919 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787512 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 14:23:29.788919 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787514 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 14:23:29.788919 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787518 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 14:23:29.788919 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787521 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 14:23:29.789375 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787524 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 14:23:29.789375 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787526 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 14:23:29.789375 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787529 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 14:23:29.789375 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787532 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 14:23:29.789375 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787534 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 14:23:29.789375 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787537 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 14:23:29.789375 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787545 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 14:23:29.789375 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787548 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 14:23:29.789375 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787551 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 14:23:29.789375 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787553 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 14:23:29.789375 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787555 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 14:23:29.789375 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787558 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 14:23:29.789375 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787560 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 14:23:29.789375 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787563 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 14:23:29.789375 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787565 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 14:23:29.789375 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787568 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 14:23:29.789375 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787570 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 14:23:29.789375 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787573 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 14:23:29.789375 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787575 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 14:23:29.789375 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787578 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 14:23:29.789869 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787581 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 14:23:29.789869 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787583 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 14:23:29.789869 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787592 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 14:23:29.789869 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787596 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 14:23:29.789869 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787599 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 14:23:29.789869 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787602 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 14:23:29.789869 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787605 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 14:23:29.789869 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787607 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 14:23:29.789869 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787610 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 14:23:29.789869 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787612 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 14:23:29.789869 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787615 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 14:23:29.789869 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787618 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 14:23:29.789869 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787621 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 14:23:29.789869 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787623 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 14:23:29.789869 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787626 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 14:23:29.789869 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787629 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 14:23:29.789869 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787631 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 14:23:29.789869 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787633 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 14:23:29.789869 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787636 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 14:23:29.790321 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787644 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 14:23:29.790321 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787647 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 14:23:29.790321 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787649 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 14:23:29.790321 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787652 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 14:23:29.790321 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787654 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 14:23:29.790321 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787656 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 14:23:29.790321 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787659 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 14:23:29.790321 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787661 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 14:23:29.790321 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787664 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 14:23:29.790321 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787666 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 14:23:29.790321 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787668 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 14:23:29.790321 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787671 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 14:23:29.790321 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787674 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 14:23:29.790321 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787676 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 14:23:29.790321 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787679 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 14:23:29.790321 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787681 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 14:23:29.790321 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787683 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 14:23:29.790321 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787686 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 14:23:29.790321 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787688 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 14:23:29.790764 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787690 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 14:23:29.790764 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787693 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 14:23:29.790764 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787695 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 14:23:29.790764 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787698 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 14:23:29.790764 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787703 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 14:23:29.790764 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787706 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 14:23:29.790764 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787709 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 14:23:29.790764 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787712 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 14:23:29.790764 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787714 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 14:23:29.790764 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.787717 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 14:23:29.790764 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789363 2567 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 14:23:29.790764 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789373 2567 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 14:23:29.790764 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789384 2567 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 14:23:29.790764 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789388 2567 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 14:23:29.790764 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789399 2567 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 14:23:29.790764 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789403 2567 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 14:23:29.790764 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789407 2567 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 14:23:29.790764 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789412 2567 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 14:23:29.790764 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789415 2567 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 14:23:29.790764 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789419 2567 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 14:23:29.790764 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789422 2567 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 14:23:29.791290 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789425 2567 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 14:23:29.791290 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789428 2567 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 14:23:29.791290 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789431 2567 flags.go:64] FLAG: --cgroup-root=""
Apr 24 14:23:29.791290 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789434 2567 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 14:23:29.791290 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789437 2567 flags.go:64] FLAG: --client-ca-file=""
Apr 24 14:23:29.791290 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789440 2567 flags.go:64] FLAG: --cloud-config=""
Apr 24 14:23:29.791290 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789444 2567 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 14:23:29.791290 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789447 2567 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 14:23:29.791290 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789454 2567 flags.go:64] FLAG: --cluster-domain=""
Apr 24 14:23:29.791290 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789457 2567 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 14:23:29.791290 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789460 2567 flags.go:64] FLAG: --config-dir=""
Apr 24 14:23:29.791290 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789463 2567 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 14:23:29.791290 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789466 2567 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 14:23:29.791290 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789470 2567 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 14:23:29.791290 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789474 2567 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 14:23:29.791290 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789477 2567 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 14:23:29.791290 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789481 2567 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 14:23:29.791290 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789484 2567 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 14:23:29.791290 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789487 2567 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 14:23:29.791290 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789489 2567 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 14:23:29.791290 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789493 2567 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 14:23:29.791290 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789495 2567 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 14:23:29.791290 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789500 2567 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 14:23:29.791290 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789502 2567 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 14:23:29.791290 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789505 2567 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 14:23:29.791896 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789508 2567 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 14:23:29.791896 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789519 2567 flags.go:64] FLAG: --enable-server="true"
Apr 24 14:23:29.791896 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789522 2567 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 14:23:29.791896 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789528 2567 flags.go:64] FLAG: --event-burst="100"
Apr 24 14:23:29.791896 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789531 2567 flags.go:64] FLAG: --event-qps="50"
Apr 24 14:23:29.791896 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789534 2567 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 14:23:29.791896 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789537 2567 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 14:23:29.791896 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789540 2567 flags.go:64] FLAG: --eviction-hard=""
Apr 24 14:23:29.791896 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789544 2567 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 14:23:29.791896 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789547 2567 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 14:23:29.791896 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789550 2567 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 14:23:29.791896 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789553 2567 flags.go:64] FLAG: --eviction-soft=""
Apr 24 14:23:29.791896 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789556 2567 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 14:23:29.791896
ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789559 2567 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 24 14:23:29.791896 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789562 2567 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 24 14:23:29.791896 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789565 2567 flags.go:64] FLAG: --experimental-mounter-path="" Apr 24 14:23:29.791896 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789568 2567 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 24 14:23:29.791896 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789571 2567 flags.go:64] FLAG: --fail-swap-on="true" Apr 24 14:23:29.791896 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789574 2567 flags.go:64] FLAG: --feature-gates="" Apr 24 14:23:29.791896 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789578 2567 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 14:23:29.791896 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789580 2567 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 14:23:29.791896 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789583 2567 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 14:23:29.791896 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789587 2567 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 14:23:29.791896 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789590 2567 flags.go:64] FLAG: --healthz-port="10248" Apr 24 14:23:29.791896 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789593 2567 flags.go:64] FLAG: --help="false" Apr 24 14:23:29.792490 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789596 2567 flags.go:64] FLAG: --hostname-override="ip-10-0-128-36.ec2.internal" Apr 24 14:23:29.792490 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789599 2567 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 14:23:29.792490 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789602 2567 flags.go:64] FLAG: 
--http-check-frequency="20s" Apr 24 14:23:29.792490 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789605 2567 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 14:23:29.792490 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789608 2567 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 14:23:29.792490 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789612 2567 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 14:23:29.792490 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789615 2567 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 14:23:29.792490 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789618 2567 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 14:23:29.792490 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789620 2567 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 14:23:29.792490 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789635 2567 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 14:23:29.792490 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789639 2567 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 14:23:29.792490 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789642 2567 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 14:23:29.792490 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789645 2567 flags.go:64] FLAG: --kube-reserved="" Apr 24 14:23:29.792490 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789647 2567 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 14:23:29.792490 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789650 2567 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 14:23:29.792490 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789653 2567 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 14:23:29.792490 ip-10-0-128-36 kubenswrapper[2567]: I0424 
14:23:29.789656 2567 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 14:23:29.792490 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789659 2567 flags.go:64] FLAG: --lock-file="" Apr 24 14:23:29.792490 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789661 2567 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 14:23:29.792490 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789664 2567 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 14:23:29.792490 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789667 2567 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 14:23:29.792490 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789677 2567 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 14:23:29.792490 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789680 2567 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 14:23:29.793037 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789683 2567 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 14:23:29.793037 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789685 2567 flags.go:64] FLAG: --logging-format="text" Apr 24 14:23:29.793037 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789688 2567 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 14:23:29.793037 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789692 2567 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 14:23:29.793037 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789695 2567 flags.go:64] FLAG: --manifest-url="" Apr 24 14:23:29.793037 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789698 2567 flags.go:64] FLAG: --manifest-url-header="" Apr 24 14:23:29.793037 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789703 2567 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 14:23:29.793037 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789706 2567 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 14:23:29.793037 
ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789710 2567 flags.go:64] FLAG: --max-pods="110" Apr 24 14:23:29.793037 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789713 2567 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 14:23:29.793037 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789716 2567 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 14:23:29.793037 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789719 2567 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 14:23:29.793037 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789722 2567 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 14:23:29.793037 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789724 2567 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 14:23:29.793037 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789727 2567 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 14:23:29.793037 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789730 2567 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 14:23:29.793037 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789738 2567 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 14:23:29.793037 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789741 2567 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 14:23:29.793037 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789744 2567 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 14:23:29.793037 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789753 2567 flags.go:64] FLAG: --pod-cidr="" Apr 24 14:23:29.793037 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789756 2567 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 14:23:29.793037 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789763 2567 flags.go:64] FLAG: 
--pod-manifest-path="" Apr 24 14:23:29.793037 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789766 2567 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 14:23:29.793037 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789769 2567 flags.go:64] FLAG: --pods-per-core="0" Apr 24 14:23:29.793600 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789772 2567 flags.go:64] FLAG: --port="10250" Apr 24 14:23:29.793600 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789775 2567 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 14:23:29.793600 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789778 2567 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0c8b35a9a46203441" Apr 24 14:23:29.793600 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789781 2567 flags.go:64] FLAG: --qos-reserved="" Apr 24 14:23:29.793600 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789784 2567 flags.go:64] FLAG: --read-only-port="10255" Apr 24 14:23:29.793600 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789787 2567 flags.go:64] FLAG: --register-node="true" Apr 24 14:23:29.793600 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789790 2567 flags.go:64] FLAG: --register-schedulable="true" Apr 24 14:23:29.793600 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789793 2567 flags.go:64] FLAG: --register-with-taints="" Apr 24 14:23:29.793600 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789796 2567 flags.go:64] FLAG: --registry-burst="10" Apr 24 14:23:29.793600 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789799 2567 flags.go:64] FLAG: --registry-qps="5" Apr 24 14:23:29.793600 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789802 2567 flags.go:64] FLAG: --reserved-cpus="" Apr 24 14:23:29.793600 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789817 2567 flags.go:64] FLAG: --reserved-memory="" Apr 24 14:23:29.793600 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789823 2567 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 14:23:29.793600 
ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789826 2567 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 14:23:29.793600 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789829 2567 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 14:23:29.793600 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789833 2567 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 14:23:29.793600 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789836 2567 flags.go:64] FLAG: --runonce="false" Apr 24 14:23:29.793600 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789839 2567 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 14:23:29.793600 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789842 2567 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 14:23:29.793600 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789845 2567 flags.go:64] FLAG: --seccomp-default="false" Apr 24 14:23:29.793600 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789848 2567 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 14:23:29.793600 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789851 2567 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 14:23:29.793600 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789854 2567 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 14:23:29.793600 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789857 2567 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 14:23:29.793600 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789860 2567 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 14:23:29.793600 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789862 2567 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 14:23:29.794252 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789865 2567 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 14:23:29.794252 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789868 2567 
flags.go:64] FLAG: --storage-driver-user="root" Apr 24 14:23:29.794252 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789877 2567 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 14:23:29.794252 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789882 2567 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 14:23:29.794252 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789885 2567 flags.go:64] FLAG: --system-cgroups="" Apr 24 14:23:29.794252 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789888 2567 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 14:23:29.794252 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789894 2567 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 14:23:29.794252 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789896 2567 flags.go:64] FLAG: --tls-cert-file="" Apr 24 14:23:29.794252 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789899 2567 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 14:23:29.794252 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789906 2567 flags.go:64] FLAG: --tls-min-version="" Apr 24 14:23:29.794252 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789909 2567 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 14:23:29.794252 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789912 2567 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 14:23:29.794252 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789915 2567 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 14:23:29.794252 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789918 2567 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 14:23:29.794252 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789921 2567 flags.go:64] FLAG: --v="2" Apr 24 14:23:29.794252 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789925 2567 flags.go:64] FLAG: --version="false" Apr 24 14:23:29.794252 ip-10-0-128-36 kubenswrapper[2567]: I0424 
14:23:29.789929 2567 flags.go:64] FLAG: --vmodule="" Apr 24 14:23:29.794252 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789934 2567 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 14:23:29.794252 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.789937 2567 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 14:23:29.794252 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790044 2567 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 14:23:29.794252 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790048 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 14:23:29.794252 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790051 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 14:23:29.794252 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790054 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 14:23:29.794252 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790057 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 14:23:29.794833 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790060 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 14:23:29.794833 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790063 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 14:23:29.794833 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790066 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 14:23:29.794833 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790068 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 14:23:29.794833 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790071 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 14:23:29.794833 ip-10-0-128-36 
kubenswrapper[2567]: W0424 14:23:29.790073 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 14:23:29.794833 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790076 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 14:23:29.794833 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790078 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 14:23:29.794833 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790081 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 14:23:29.794833 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790083 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 14:23:29.794833 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790086 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 14:23:29.794833 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790095 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 14:23:29.794833 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790098 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 14:23:29.794833 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790101 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 14:23:29.794833 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790103 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 14:23:29.794833 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790107 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 14:23:29.794833 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790111 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 14:23:29.794833 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790113 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 14:23:29.794833 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790116 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 14:23:29.795322 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790119 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 14:23:29.795322 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790122 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 14:23:29.795322 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790126 2567 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 14:23:29.795322 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790129 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 14:23:29.795322 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790131 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 14:23:29.795322 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790134 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 14:23:29.795322 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790136 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 14:23:29.795322 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790139 2567 feature_gate.go:328] unrecognized feature gate: Example Apr 24 14:23:29.795322 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790141 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 14:23:29.795322 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790144 2567 feature_gate.go:328] unrecognized feature gate: 
GCPCustomAPIEndpoints Apr 24 14:23:29.795322 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790146 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 14:23:29.795322 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790149 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 14:23:29.795322 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790152 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 14:23:29.795322 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790156 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 14:23:29.795322 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790159 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 14:23:29.795322 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790161 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 14:23:29.795322 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790164 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 14:23:29.795322 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790166 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 14:23:29.795322 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790168 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 14:23:29.795774 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790171 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 14:23:29.795774 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790173 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 14:23:29.795774 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790176 2567 feature_gate.go:328] unrecognized feature gate: 
OpenShiftPodSecurityAdmission Apr 24 14:23:29.795774 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790178 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 14:23:29.795774 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790180 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 14:23:29.795774 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790188 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 14:23:29.795774 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790196 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 14:23:29.795774 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790199 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 14:23:29.795774 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790201 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 14:23:29.795774 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790204 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 14:23:29.795774 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790206 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 14:23:29.795774 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790209 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 14:23:29.795774 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790211 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 14:23:29.795774 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790214 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 14:23:29.795774 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790216 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 14:23:29.795774 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790219 2567 feature_gate.go:328] unrecognized 
feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 14:23:29.795774 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790222 2567 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 14:23:29.795774 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790224 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 14:23:29.795774 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790227 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 14:23:29.795774 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790229 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 14:23:29.796279 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790232 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 14:23:29.796279 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790234 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 14:23:29.796279 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790237 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 14:23:29.796279 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790239 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 14:23:29.796279 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790242 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 14:23:29.796279 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790244 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 14:23:29.796279 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790247 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 14:23:29.796279 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790249 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 14:23:29.796279 ip-10-0-128-36 kubenswrapper[2567]: 
W0424 14:23:29.790252 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 14:23:29.796279 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790254 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 14:23:29.796279 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790257 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 14:23:29.796279 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790260 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 14:23:29.796279 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790262 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 14:23:29.796279 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790265 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 14:23:29.796279 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790267 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 14:23:29.796279 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790269 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 14:23:29.796279 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790272 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 14:23:29.796279 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790275 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 14:23:29.796279 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790278 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 14:23:29.796279 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790287 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 14:23:29.796766 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790290 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 
14:23:29.796766 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790292 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 14:23:29.796766 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.790295 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 14:23:29.796766 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.791062 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 14:23:29.798540 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.798517 2567 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 24 14:23:29.798575 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.798541 2567 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 24 14:23:29.798606 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798594 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 14:23:29.798606 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798600 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 14:23:29.798606 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798603 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 14:23:29.798606 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798606 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 14:23:29.798706 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798610 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 14:23:29.798706 
ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798613 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 14:23:29.798706 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798616 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 14:23:29.798706 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798618 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 14:23:29.798706 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798620 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 14:23:29.798706 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798623 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 14:23:29.798706 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798626 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 14:23:29.798706 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798628 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 14:23:29.798706 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798631 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 14:23:29.798706 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798633 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 14:23:29.798706 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798636 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 14:23:29.798706 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798638 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 14:23:29.798706 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798641 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 14:23:29.798706 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798644 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 14:23:29.798706 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798646 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 14:23:29.798706 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798648 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 14:23:29.798706 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798651 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 14:23:29.798706 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798653 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 14:23:29.798706 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798656 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 14:23:29.798706 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798659 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 14:23:29.798706 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798661 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 14:23:29.799243 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798664 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 14:23:29.799243 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798668 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 14:23:29.799243 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798672 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 14:23:29.799243 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798675 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 14:23:29.799243 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798677 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 14:23:29.799243 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798680 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 14:23:29.799243 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798682 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 14:23:29.799243 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798685 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 14:23:29.799243 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798688 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 14:23:29.799243 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798690 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 14:23:29.799243 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798693 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 14:23:29.799243 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798695 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 14:23:29.799243 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798698 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 14:23:29.799243 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798701 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 14:23:29.799243 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798704 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 14:23:29.799243 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798706 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 14:23:29.799243 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798709 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 14:23:29.799243 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798712 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 14:23:29.799243 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798715 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 14:23:29.799683 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798717 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 14:23:29.799683 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798720 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 14:23:29.799683 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798722 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 14:23:29.799683 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798725 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 14:23:29.799683 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798727 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 14:23:29.799683 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798730 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 14:23:29.799683 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798732 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 14:23:29.799683 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798735 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 14:23:29.799683 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798737 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 14:23:29.799683 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798740 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 14:23:29.799683 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798742 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 14:23:29.799683 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798745 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 14:23:29.799683 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798747 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 14:23:29.799683 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798750 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 14:23:29.799683 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798753 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 14:23:29.799683 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798755 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 14:23:29.799683 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798758 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 14:23:29.799683 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798760 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 14:23:29.799683 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798763 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 14:23:29.800180 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798765 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 14:23:29.800180 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798768 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 14:23:29.800180 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798770 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 14:23:29.800180 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798772 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 14:23:29.800180 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798775 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 14:23:29.800180 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798778 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 14:23:29.800180 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798781 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 14:23:29.800180 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798784 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 14:23:29.800180 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798786 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 14:23:29.800180 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798789 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 14:23:29.800180 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798791 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 14:23:29.800180 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798794 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 14:23:29.800180 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798797 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 14:23:29.800180 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798800 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 14:23:29.800180 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798803 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 14:23:29.800180 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798828 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 14:23:29.800180 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798832 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 14:23:29.800180 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798834 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 14:23:29.800180 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798837 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 14:23:29.800623 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798839 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 14:23:29.800623 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798841 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 14:23:29.800623 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798844 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 14:23:29.800623 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798847 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 14:23:29.800623 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.798852 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 14:23:29.800623 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798955 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 14:23:29.800623 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798959 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 14:23:29.800623 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798962 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 14:23:29.800623 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798965 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 14:23:29.800623 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798969 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 14:23:29.800623 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798971 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 14:23:29.800623 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798974 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 14:23:29.800623 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798976 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 14:23:29.800623 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798979 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 14:23:29.800623 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798982 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 14:23:29.801006 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798984 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 14:23:29.801006 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798986 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 14:23:29.801006 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798989 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 14:23:29.801006 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798991 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 14:23:29.801006 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798994 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 14:23:29.801006 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798996 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 14:23:29.801006 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.798999 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 14:23:29.801006 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799002 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 14:23:29.801006 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799005 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 14:23:29.801006 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799009 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 14:23:29.801006 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799012 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 14:23:29.801006 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799015 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 14:23:29.801006 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799017 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 14:23:29.801006 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799021 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 14:23:29.801006 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799024 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 14:23:29.801006 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799027 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 14:23:29.801006 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799029 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 14:23:29.801006 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799032 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 14:23:29.801006 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799035 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 14:23:29.801452 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799037 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 14:23:29.801452 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799039 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 14:23:29.801452 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799042 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 14:23:29.801452 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799045 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 14:23:29.801452 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799048 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 14:23:29.801452 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799050 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 14:23:29.801452 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799053 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 14:23:29.801452 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799055 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 14:23:29.801452 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799058 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 14:23:29.801452 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799061 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 14:23:29.801452 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799063 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 14:23:29.801452 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799066 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 14:23:29.801452 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799068 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 14:23:29.801452 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799070 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 14:23:29.801452 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799073 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 14:23:29.801452 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799076 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 14:23:29.801452 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799078 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 14:23:29.801452 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799080 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 14:23:29.801452 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799083 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 14:23:29.801452 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799086 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 14:23:29.802038 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799088 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 14:23:29.802038 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799091 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 14:23:29.802038 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799093 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 14:23:29.802038 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799096 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 14:23:29.802038 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799098 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 14:23:29.802038 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799101 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 14:23:29.802038 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799103 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 14:23:29.802038 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799106 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 14:23:29.802038 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799108 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 14:23:29.802038 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799111 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 14:23:29.802038 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799113 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 14:23:29.802038 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799116 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 14:23:29.802038 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799118 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 14:23:29.802038 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799121 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 14:23:29.802038 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799123 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 14:23:29.802038 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799126 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 14:23:29.802038 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799129 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 14:23:29.802038 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799133 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 14:23:29.802480 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799136 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 14:23:29.802480 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799138 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 14:23:29.802480 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799140 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 14:23:29.802480 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799144 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 14:23:29.802480 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799146 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 14:23:29.802480 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799148 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 14:23:29.802480 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799151 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 14:23:29.802480 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799153 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 14:23:29.802480 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799156 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 14:23:29.802480 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799158 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 14:23:29.802480 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799160 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 14:23:29.802480 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799163 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 14:23:29.802480 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799166 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 14:23:29.802480 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799169 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 14:23:29.802480 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799171 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 14:23:29.802480 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799174 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 14:23:29.802480 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799176 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 14:23:29.802480 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799178 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 14:23:29.802480 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:29.799181 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 14:23:29.802931 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.799185 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 14:23:29.802931 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.800022 2567 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 14:23:29.802931 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.802319 2567 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 14:23:29.803404 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.803392 2567 server.go:1019] "Starting client certificate rotation"
Apr 24 14:23:29.803501 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.803484 2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 14:23:29.804141 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.804130 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 14:23:29.832939 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.832909 2567 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 14:23:29.835978 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.835949 2567 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 14:23:29.854664 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.852916 2567 log.go:25] "Validated CRI v1 runtime API"
Apr 24 14:23:29.862517 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.862494 2567 log.go:25] "Validated CRI v1 image API"
Apr 24 14:23:29.865418 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.865390 2567 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 14:23:29.869491 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.869465 2567 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 ac35ece2-db19-47f9-bbd2-2d43e03bad5c:/dev/nvme0n1p3 df7979cd-5c7a-4c63-b4d0-b25b65e09aea:/dev/nvme0n1p4]
Apr 24 14:23:29.869572 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.869490 2567 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 14:23:29.870063 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.870044 2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 14:23:29.875783 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.875672 2567 manager.go:217] Machine: {Timestamp:2026-04-24 14:23:29.873567058 +0000 UTC m=+0.463719771 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3200186 MemoryCapacity:33164484608 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2b4ca346d9dc2616b341e181c49873 SystemUUID:ec2b4ca3-46d9-dc26-16b3-41e181c49873 BootID:fcffbd62-76bd-4013-8453-964643c79885 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:f3:ba:b0:da:47 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:f3:ba:b0:da:47 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:1a:7b:14:3b:a7:d2 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164484608 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 14:23:29.875783 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.875776 2567 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 14:23:29.875900 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.875886 2567 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 14:23:29.876947 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.876924 2567 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 14:23:29.877097 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.876950 2567 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-128-36.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 24 14:23:29.877142 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.877107 2567 topology_manager.go:138] "Creating topology manager with none policy"
Apr 24 14:23:29.877142 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.877116 2567 container_manager_linux.go:306] "Creating device plugin manager"
Apr 24 14:23:29.877142 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.877129 2567 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 14:23:29.877219 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.877144 2567 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 14:23:29.878611 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.878601 2567 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 14:23:29.878743 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.878735 2567 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 24 14:23:29.881669 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.881652 2567 kubelet.go:491] "Attempting to sync node with API server"
Apr 24 14:23:29.881669 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.881671 2567 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 24 14:23:29.881841 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.881686 2567 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 24 14:23:29.881841 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.881699 2567 kubelet.go:397] "Adding apiserver pod source"
Apr 24 14:23:29.881841 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.881720 2567 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 24 14:23:29.882936 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.882921 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 14:23:29.883003 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.882945 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 14:23:29.886316 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.886299 2567 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 14:23:29.887991 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.887978 2567 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 14:23:29.889055 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.889039 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 14:23:29.889099 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.889063 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 14:23:29.889099 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.889090 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 14:23:29.889099 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.889098 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 14:23:29.889182 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.889105 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 14:23:29.889182 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.889111 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 14:23:29.889182 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.889116 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 
14:23:29.889182 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.889122 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 14:23:29.889182 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.889129 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 14:23:29.889182 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.889136 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 14:23:29.889182 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.889148 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 14:23:29.889579 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.889568 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 14:23:29.890594 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.890579 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 14:23:29.890594 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.890595 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 14:23:29.894383 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.894368 2567 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 14:23:29.894470 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.894409 2567 server.go:1295] "Started kubelet" Apr 24 14:23:29.894594 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.894541 2567 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 14:23:29.894647 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.894617 2567 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 14:23:29.895136 ip-10-0-128-36 systemd[1]: Started Kubernetes Kubelet. 
Apr 24 14:23:29.895670 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.895624 2567 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 24 14:23:29.896089 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.896061 2567 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 24 14:23:29.896176 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.896164 2567 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-36.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 24 14:23:29.896258 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:29.896234 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 24 14:23:29.896383 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:29.896365 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-128-36.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 24 14:23:29.898637 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.898622 2567 server.go:317] "Adding debug handlers to kubelet server"
Apr 24 14:23:29.904717 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.904697 2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 24 14:23:29.905105 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:29.904100 2567 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-36.ec2.internal.18a951050032b25c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-36.ec2.internal,UID:ip-10-0-128-36.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-128-36.ec2.internal,},FirstTimestamp:2026-04-24 14:23:29.894380124 +0000 UTC m=+0.484532836,LastTimestamp:2026-04-24 14:23:29.894380124 +0000 UTC m=+0.484532836,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-36.ec2.internal,}"
Apr 24 14:23:29.905427 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.905409 2567 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 24 14:23:29.905600 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:29.905583 2567 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 24 14:23:29.907452 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.907333 2567 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 24 14:23:29.907546 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.907536 2567 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 24 14:23:29.907937 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:29.907916 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-36.ec2.internal\" not found"
Apr 24 14:23:29.908377 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.908356 2567 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 24 14:23:29.909434 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.908617 2567 reconstruct.go:97] "Volume reconstruction finished"
Apr 24 14:23:29.909434 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.908631 2567 reconciler.go:26] "Reconciler: start to sync state"
Apr 24 14:23:29.909434 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.908683 2567 factory.go:55] Registering systemd factory
Apr 24 14:23:29.909434 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.908699 2567 factory.go:223] Registration of the systemd container factory successfully
Apr 24 14:23:29.909434 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.908959 2567 factory.go:153] Registering CRI-O factory
Apr 24 14:23:29.909434 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.908982 2567 factory.go:223] Registration of the crio container factory successfully
Apr 24 14:23:29.909434 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.909033 2567 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 24 14:23:29.909434 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.909060 2567 factory.go:103] Registering Raw factory
Apr 24 14:23:29.909434 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.909075 2567 manager.go:1196] Started watching for new ooms in manager
Apr 24 14:23:29.909893 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:29.909514 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 24 14:23:29.909893 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.909727 2567 manager.go:319] Starting recovery of all containers
Apr 24 14:23:29.909893 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:29.909778 2567 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-128-36.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 24 14:23:29.913799 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.913771 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-n62lz"
Apr 24 14:23:29.916028 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.915843 2567 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 24 14:23:29.921101 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.921083 2567 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-n62lz"
Apr 24 14:23:29.921296 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.921283 2567 manager.go:324] Recovery completed
Apr 24 14:23:29.925376 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.925364 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 14:23:29.930030 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.930013 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-36.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 14:23:29.930102 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.930041 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-36.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 14:23:29.930102 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.930051 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-36.ec2.internal" event="NodeHasSufficientPID"
Apr 24 14:23:29.930565 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.930550 2567 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 24 14:23:29.930565 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.930562 2567 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 24 14:23:29.930697 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.930579 2567 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 14:23:29.932304 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:29.932233 2567 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-36.ec2.internal.18a951050252a684 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-36.ec2.internal,UID:ip-10-0-128-36.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-128-36.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-128-36.ec2.internal,},FirstTimestamp:2026-04-24 14:23:29.930028676 +0000 UTC m=+0.520181391,LastTimestamp:2026-04-24 14:23:29.930028676 +0000 UTC m=+0.520181391,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-36.ec2.internal,}"
Apr 24 14:23:29.933447 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.933436 2567 policy_none.go:49] "None policy: Start"
Apr 24 14:23:29.933496 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.933452 2567 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 24 14:23:29.933496 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.933462 2567 state_mem.go:35] "Initializing new in-memory state store"
Apr 24 14:23:29.969440 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.969422 2567 manager.go:341] "Starting Device Plugin manager"
Apr 24 14:23:29.980890 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:29.969470 2567 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 24 14:23:29.980890 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.969484 2567 server.go:85] "Starting device plugin registration server"
Apr 24 14:23:29.980890 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.969766 2567 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 24 14:23:29.980890 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.969776 2567 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 24 14:23:29.980890 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.969892 2567 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 24 14:23:29.980890 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.969957 2567 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 24 14:23:29.980890 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:29.969965 2567 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 24 14:23:29.980890 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:29.970573 2567 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 24 14:23:29.980890 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:29.970618 2567 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-128-36.ec2.internal\" not found"
Apr 24 14:23:30.017940 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:30.017908 2567 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 24 14:23:30.017940 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:30.017944 2567 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 24 14:23:30.018111 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:30.017963 2567 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 24 14:23:30.018111 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:30.017970 2567 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 24 14:23:30.018111 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:30.018004 2567 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 24 14:23:30.021311 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:30.021290 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 14:23:30.070404 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:30.070331 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 14:23:30.071484 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:30.071467 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-36.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 14:23:30.071570 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:30.071501 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-36.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 14:23:30.071570 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:30.071514 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-36.ec2.internal" event="NodeHasSufficientPID"
Apr 24 14:23:30.071570 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:30.071545 2567 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-128-36.ec2.internal"
Apr 24 14:23:30.079469 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:30.079449 2567 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-128-36.ec2.internal"
Apr 24 14:23:30.079559 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:30.079477 2567 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-128-36.ec2.internal\": node \"ip-10-0-128-36.ec2.internal\" not found"
Apr 24 14:23:30.097154 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:30.097120 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-36.ec2.internal\" not found"
Apr 24 14:23:30.118735 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:30.118700 2567 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-36.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-128-36.ec2.internal"]
Apr 24 14:23:30.118910 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:30.118776 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 14:23:30.121170 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:30.121155 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-36.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 14:23:30.121233 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:30.121184 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-36.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 14:23:30.121233 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:30.121194 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-36.ec2.internal" event="NodeHasSufficientPID"
Apr 24 14:23:30.122393 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:30.122380 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 14:23:30.122515 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:30.122500 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-36.ec2.internal"
Apr 24 14:23:30.122559 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:30.122528 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 14:23:30.123087 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:30.123070 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-36.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 14:23:30.123087 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:30.123080 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-36.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 14:23:30.123202 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:30.123103 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-36.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 14:23:30.123202 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:30.123114 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-36.ec2.internal" event="NodeHasSufficientPID"
Apr 24 14:23:30.123202 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:30.123167 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-36.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 14:23:30.123202 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:30.123185 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-36.ec2.internal" event="NodeHasSufficientPID"
Apr 24 14:23:30.124942 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:30.124919 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-36.ec2.internal"
Apr 24 14:23:30.125034 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:30.124964 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 14:23:30.125922 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:30.125905 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-36.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 14:23:30.126002 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:30.125931 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-36.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 14:23:30.126002 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:30.125941 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-36.ec2.internal" event="NodeHasSufficientPID"
Apr 24 14:23:30.150791 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:30.150761 2567 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-36.ec2.internal\" not found" node="ip-10-0-128-36.ec2.internal"
Apr 24 14:23:30.155128 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:30.155111 2567 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-36.ec2.internal\" not found" node="ip-10-0-128-36.ec2.internal"
Apr 24 14:23:30.197685 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:30.197656 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-36.ec2.internal\" not found"
Apr 24 14:23:30.210375 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:30.210353 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1bd5d9b9689ebb5ba9a6f14c7f815d41-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-36.ec2.internal\" (UID: \"1bd5d9b9689ebb5ba9a6f14c7f815d41\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-36.ec2.internal"
Apr 24 14:23:30.210444 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:30.210379 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1bd5d9b9689ebb5ba9a6f14c7f815d41-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-36.ec2.internal\" (UID: \"1bd5d9b9689ebb5ba9a6f14c7f815d41\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-36.ec2.internal"
Apr 24 14:23:30.210444 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:30.210399 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ec6b2c56954869eef2ac269c4c51a583-config\") pod \"kube-apiserver-proxy-ip-10-0-128-36.ec2.internal\" (UID: \"ec6b2c56954869eef2ac269c4c51a583\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-36.ec2.internal"
Apr 24 14:23:30.298677 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:30.298646 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-36.ec2.internal\" not found"
Apr 24 14:23:30.311151 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:30.311128 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1bd5d9b9689ebb5ba9a6f14c7f815d41-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-36.ec2.internal\" (UID: \"1bd5d9b9689ebb5ba9a6f14c7f815d41\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-36.ec2.internal"
Apr 24 14:23:30.311219 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:30.311148 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1bd5d9b9689ebb5ba9a6f14c7f815d41-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-36.ec2.internal\" (UID: \"1bd5d9b9689ebb5ba9a6f14c7f815d41\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-36.ec2.internal"
Apr 24 14:23:30.311219 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:30.311168 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ec6b2c56954869eef2ac269c4c51a583-config\") pod \"kube-apiserver-proxy-ip-10-0-128-36.ec2.internal\" (UID: \"ec6b2c56954869eef2ac269c4c51a583\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-36.ec2.internal"
Apr 24 14:23:30.311219 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:30.311202 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ec6b2c56954869eef2ac269c4c51a583-config\") pod \"kube-apiserver-proxy-ip-10-0-128-36.ec2.internal\" (UID: \"ec6b2c56954869eef2ac269c4c51a583\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-36.ec2.internal"
Apr 24 14:23:30.311219 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:30.311203 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1bd5d9b9689ebb5ba9a6f14c7f815d41-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-36.ec2.internal\" (UID: \"1bd5d9b9689ebb5ba9a6f14c7f815d41\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-36.ec2.internal"
Apr 24 14:23:30.311342 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:30.311223 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1bd5d9b9689ebb5ba9a6f14c7f815d41-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-36.ec2.internal\" (UID: \"1bd5d9b9689ebb5ba9a6f14c7f815d41\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-36.ec2.internal"
Apr 24 14:23:30.399606 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:30.399534 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-36.ec2.internal\" not found"
Apr 24 14:23:30.453201 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:30.453167 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-36.ec2.internal"
Apr 24 14:23:30.457566 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:30.457403 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-36.ec2.internal"
Apr 24 14:23:30.499964 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:30.499932 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-36.ec2.internal\" not found"
Apr 24 14:23:30.600615 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:30.600583 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-36.ec2.internal\" not found"
Apr 24 14:23:30.701352 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:30.701265 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-36.ec2.internal\" not found"
Apr 24 14:23:30.801973 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:30.801936 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-36.ec2.internal\" not found"
Apr 24 14:23:30.803035 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:30.803017 2567 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 24 14:23:30.803201 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:30.803185 2567 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 14:23:30.902026 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:30.901998 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-36.ec2.internal\" not found"
Apr 24 14:23:30.905107 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:30.905072 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 24 14:23:30.915485 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:30.915459 2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 14:23:30.921729 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:30.921700 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 14:18:29 +0000 UTC" deadline="2028-01-23 03:27:29.458849892 +0000 UTC"
Apr 24 14:23:30.921729 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:30.921727 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15325h3m58.537125775s"
Apr 24 14:23:30.937857 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:30.937828 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-pnpbh"
Apr 24 14:23:30.946592 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:30.946571 2567 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-pnpbh"
Apr 24 14:23:30.951308 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:30.951274 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1bd5d9b9689ebb5ba9a6f14c7f815d41.slice/crio-d7e12db505ece0e8240c91e5f68d00b38de910a22fe57421d19eca1c4c935ca8 WatchSource:0}: Error finding container d7e12db505ece0e8240c91e5f68d00b38de910a22fe57421d19eca1c4c935ca8: Status 404 returned error can't find the container with id d7e12db505ece0e8240c91e5f68d00b38de910a22fe57421d19eca1c4c935ca8
Apr 24 14:23:30.951565 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:30.951523 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec6b2c56954869eef2ac269c4c51a583.slice/crio-ddcdf58cd62a216a67e53c2f0ef57f5b3da0b945e494a5d90651745fd7236d2e WatchSource:0}: Error finding container ddcdf58cd62a216a67e53c2f0ef57f5b3da0b945e494a5d90651745fd7236d2e: Status 404 returned error can't find the container with id ddcdf58cd62a216a67e53c2f0ef57f5b3da0b945e494a5d90651745fd7236d2e
Apr 24 14:23:30.956203 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:30.956183 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 14:23:31.002274 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:31.002230 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-36.ec2.internal\" not found"
Apr 24 14:23:31.020869 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.020797 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-36.ec2.internal" event={"ID":"1bd5d9b9689ebb5ba9a6f14c7f815d41","Type":"ContainerStarted","Data":"d7e12db505ece0e8240c91e5f68d00b38de910a22fe57421d19eca1c4c935ca8"}
Apr 24 14:23:31.021675 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.021657 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-36.ec2.internal" event={"ID":"ec6b2c56954869eef2ac269c4c51a583","Type":"ContainerStarted","Data":"ddcdf58cd62a216a67e53c2f0ef57f5b3da0b945e494a5d90651745fd7236d2e"}
Apr 24 14:23:31.026821 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.026790 2567 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 14:23:31.103286 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:31.103251 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-36.ec2.internal\" not found"
Apr 24 14:23:31.203962 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:31.203881 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-36.ec2.internal\" not found"
Apr 24 14:23:31.218281 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.218259 2567 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 14:23:31.305683 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.305654 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-36.ec2.internal"
Apr 24 14:23:31.318989 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.318962 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 14:23:31.320827 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.320800 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-36.ec2.internal"
Apr 24 14:23:31.328294 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.328266 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 14:23:31.497354 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.497265
2567 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 14:23:31.882912 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.882877 2567 apiserver.go:52] "Watching apiserver" Apr 24 14:23:31.888108 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.888084 2567 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 24 14:23:31.889861 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.889836 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-36.ec2.internal","openshift-ovn-kubernetes/ovnkube-node-8jd5s","kube-system/konnectivity-agent-vrgvl","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7npfb","openshift-image-registry/node-ca-m4w28","openshift-multus/multus-additional-cni-plugins-cb58k","openshift-multus/multus-h5ptb","openshift-multus/network-metrics-daemon-ct9nz","openshift-network-diagnostics/network-check-target-57rkt","openshift-network-operator/iptables-alerter-m2lt2","kube-system/kube-apiserver-proxy-ip-10-0-128-36.ec2.internal","openshift-cluster-node-tuning-operator/tuned-vl2zx"] Apr 24 14:23:31.891348 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.891327 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s" Apr 24 14:23:31.893385 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.893353 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 14:23:31.893385 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.893368 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 14:23:31.893551 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.893450 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 14:23:31.893551 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.893504 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-vrgvl" Apr 24 14:23:31.893650 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.893603 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7npfb" Apr 24 14:23:31.893968 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.893949 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 14:23:31.894510 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.894420 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 14:23:31.894510 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.894444 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 14:23:31.894652 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.894587 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-hb9sx\"" Apr 24 14:23:31.896044 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.895452 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 24 14:23:31.896044 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.895574 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 14:23:31.896044 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.895603 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 14:23:31.896044 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.895753 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-59jzc\"" Apr 24 14:23:31.896044 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.895766 2567 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"kube-system\"/\"default-dockercfg-fbpnt\"" Apr 24 14:23:31.896044 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.895845 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 14:23:31.896826 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.896496 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 14:23:31.896826 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.896590 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-m4w28" Apr 24 14:23:31.898241 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.898199 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 14:23:31.898336 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.898267 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-96gr7\"" Apr 24 14:23:31.898442 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.898426 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 14:23:31.898525 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.898506 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 14:23:31.899741 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.899688 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-cb58k" Apr 24 14:23:31.899851 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.899762 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-h5ptb" Apr 24 14:23:31.901219 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.901186 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ct9nz" Apr 24 14:23:31.901304 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:31.901289 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ct9nz" podUID="952d5757-28bc-4940-9fa6-4a50ffff6476" Apr 24 14:23:31.902013 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.901596 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 14:23:31.902013 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.901641 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-mg2j4\"" Apr 24 14:23:31.902013 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.901742 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-kmxbf\"" Apr 24 14:23:31.902013 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.901789 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 14:23:31.902013 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.901794 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 14:23:31.902013 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.901861 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 14:23:31.902376 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.902068 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 14:23:31.902376 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.902159 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 14:23:31.903447 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.903428 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57rkt" Apr 24 14:23:31.903538 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:31.903495 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57rkt" podUID="f33ebad9-63f4-4a25-865f-68c02ee70c85" Apr 24 14:23:31.903600 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.903589 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-m2lt2" Apr 24 14:23:31.904953 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.904937 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-vl2zx" Apr 24 14:23:31.905471 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.905451 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 14:23:31.905471 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.905453 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 14:23:31.905608 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.905529 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 14:23:31.905742 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.905726 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-862gl\"" Apr 24 14:23:31.906708 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.906691 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 24 14:23:31.906792 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.906755 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-hzp8q\"" Apr 24 14:23:31.906959 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.906801 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 24 14:23:31.909312 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.909296 2567 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 24 14:23:31.919936 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.919918 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e0bd6039-b2d8-405a-b478-69690078dd73-log-socket\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s" Apr 24 14:23:31.920009 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.919946 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e0bd6039-b2d8-405a-b478-69690078dd73-ovnkube-config\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s" Apr 24 14:23:31.920009 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.919969 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9ngj\" (UniqueName: \"kubernetes.io/projected/5fc12c1f-0a1b-49b5-b20f-0208c496ba66-kube-api-access-b9ngj\") pod \"iptables-alerter-m2lt2\" (UID: \"5fc12c1f-0a1b-49b5-b20f-0208c496ba66\") " pod="openshift-network-operator/iptables-alerter-m2lt2" Apr 24 14:23:31.920009 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.920002 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a071a258-93d2-4209-8cc4-3e5105208e68-registration-dir\") pod \"aws-ebs-csi-driver-node-7npfb\" (UID: \"a071a258-93d2-4209-8cc4-3e5105208e68\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7npfb" Apr 24 14:23:31.920133 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.920045 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvh6j\" (UniqueName: \"kubernetes.io/projected/2a44ddf6-7291-4385-a9e5-9e6dd777407e-kube-api-access-dvh6j\") pod \"node-ca-m4w28\" (UID: \"2a44ddf6-7291-4385-a9e5-9e6dd777407e\") " 
pod="openshift-image-registry/node-ca-m4w28" Apr 24 14:23:31.920133 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.920078 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e0bd6039-b2d8-405a-b478-69690078dd73-host-cni-netd\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s" Apr 24 14:23:31.920133 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.920114 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/5fc12c1f-0a1b-49b5-b20f-0208c496ba66-iptables-alerter-script\") pod \"iptables-alerter-m2lt2\" (UID: \"5fc12c1f-0a1b-49b5-b20f-0208c496ba66\") " pod="openshift-network-operator/iptables-alerter-m2lt2" Apr 24 14:23:31.920257 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.920139 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/1fba2eb1-536b-4442-bf40-5af241dd98fd-etc-sysconfig\") pod \"tuned-vl2zx\" (UID: \"1fba2eb1-536b-4442-bf40-5af241dd98fd\") " pod="openshift-cluster-node-tuning-operator/tuned-vl2zx" Apr 24 14:23:31.920257 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.920162 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1fba2eb1-536b-4442-bf40-5af241dd98fd-etc-kubernetes\") pod \"tuned-vl2zx\" (UID: \"1fba2eb1-536b-4442-bf40-5af241dd98fd\") " pod="openshift-cluster-node-tuning-operator/tuned-vl2zx" Apr 24 14:23:31.920257 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.920191 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: 
\"kubernetes.io/host-path/1fba2eb1-536b-4442-bf40-5af241dd98fd-etc-sysctl-conf\") pod \"tuned-vl2zx\" (UID: \"1fba2eb1-536b-4442-bf40-5af241dd98fd\") " pod="openshift-cluster-node-tuning-operator/tuned-vl2zx" Apr 24 14:23:31.920257 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.920220 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1fba2eb1-536b-4442-bf40-5af241dd98fd-run\") pod \"tuned-vl2zx\" (UID: \"1fba2eb1-536b-4442-bf40-5af241dd98fd\") " pod="openshift-cluster-node-tuning-operator/tuned-vl2zx" Apr 24 14:23:31.920477 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.920244 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/04393dd2-c684-4592-bc88-2223fac95a11-os-release\") pod \"multus-additional-cni-plugins-cb58k\" (UID: \"04393dd2-c684-4592-bc88-2223fac95a11\") " pod="openshift-multus/multus-additional-cni-plugins-cb58k" Apr 24 14:23:31.920520 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.920495 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/04393dd2-c684-4592-bc88-2223fac95a11-cni-binary-copy\") pod \"multus-additional-cni-plugins-cb58k\" (UID: \"04393dd2-c684-4592-bc88-2223fac95a11\") " pod="openshift-multus/multus-additional-cni-plugins-cb58k" Apr 24 14:23:31.920551 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.920520 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzql6\" (UniqueName: \"kubernetes.io/projected/f33ebad9-63f4-4a25-865f-68c02ee70c85-kube-api-access-tzql6\") pod \"network-check-target-57rkt\" (UID: \"f33ebad9-63f4-4a25-865f-68c02ee70c85\") " pod="openshift-network-diagnostics/network-check-target-57rkt" Apr 24 14:23:31.920587 ip-10-0-128-36 
kubenswrapper[2567]: I0424 14:23:31.920560 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e0bd6039-b2d8-405a-b478-69690078dd73-host-cni-bin\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s" Apr 24 14:23:31.920619 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.920582 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1fba2eb1-536b-4442-bf40-5af241dd98fd-host\") pod \"tuned-vl2zx\" (UID: \"1fba2eb1-536b-4442-bf40-5af241dd98fd\") " pod="openshift-cluster-node-tuning-operator/tuned-vl2zx" Apr 24 14:23:31.920619 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.920607 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a071a258-93d2-4209-8cc4-3e5105208e68-etc-selinux\") pod \"aws-ebs-csi-driver-node-7npfb\" (UID: \"a071a258-93d2-4209-8cc4-3e5105208e68\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7npfb" Apr 24 14:23:31.920707 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.920631 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-multus-cni-dir\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " pod="openshift-multus/multus-h5ptb" Apr 24 14:23:31.920707 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.920660 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-host-var-lib-kubelet\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " 
pod="openshift-multus/multus-h5ptb" Apr 24 14:23:31.920707 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.920683 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv6zc\" (UniqueName: \"kubernetes.io/projected/04393dd2-c684-4592-bc88-2223fac95a11-kube-api-access-dv6zc\") pod \"multus-additional-cni-plugins-cb58k\" (UID: \"04393dd2-c684-4592-bc88-2223fac95a11\") " pod="openshift-multus/multus-additional-cni-plugins-cb58k" Apr 24 14:23:31.920897 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.920706 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-hostroot\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " pod="openshift-multus/multus-h5ptb" Apr 24 14:23:31.920897 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.920750 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e0bd6039-b2d8-405a-b478-69690078dd73-run-ovn\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s" Apr 24 14:23:31.920897 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.920774 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a071a258-93d2-4209-8cc4-3e5105208e68-sys-fs\") pod \"aws-ebs-csi-driver-node-7npfb\" (UID: \"a071a258-93d2-4209-8cc4-3e5105208e68\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7npfb" Apr 24 14:23:31.920897 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.920820 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/2a44ddf6-7291-4385-a9e5-9e6dd777407e-serviceca\") pod \"node-ca-m4w28\" (UID: \"2a44ddf6-7291-4385-a9e5-9e6dd777407e\") " pod="openshift-image-registry/node-ca-m4w28" Apr 24 14:23:31.920897 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.920858 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/04393dd2-c684-4592-bc88-2223fac95a11-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cb58k\" (UID: \"04393dd2-c684-4592-bc88-2223fac95a11\") " pod="openshift-multus/multus-additional-cni-plugins-cb58k" Apr 24 14:23:31.920897 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.920891 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/04393dd2-c684-4592-bc88-2223fac95a11-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-cb58k\" (UID: \"04393dd2-c684-4592-bc88-2223fac95a11\") " pod="openshift-multus/multus-additional-cni-plugins-cb58k" Apr 24 14:23:31.921191 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.920915 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e0bd6039-b2d8-405a-b478-69690078dd73-systemd-units\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s" Apr 24 14:23:31.921191 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.920944 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e0bd6039-b2d8-405a-b478-69690078dd73-host-slash\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s" Apr 24 14:23:31.921191 ip-10-0-128-36 
kubenswrapper[2567]: I0424 14:23:31.920969 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e0bd6039-b2d8-405a-b478-69690078dd73-host-run-netns\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s" Apr 24 14:23:31.921191 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.921006 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0bd6039-b2d8-405a-b478-69690078dd73-run-openvswitch\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s" Apr 24 14:23:31.921191 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.921031 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e0bd6039-b2d8-405a-b478-69690078dd73-node-log\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s" Apr 24 14:23:31.921191 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.921078 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/04393dd2-c684-4592-bc88-2223fac95a11-cnibin\") pod \"multus-additional-cni-plugins-cb58k\" (UID: \"04393dd2-c684-4592-bc88-2223fac95a11\") " pod="openshift-multus/multus-additional-cni-plugins-cb58k" Apr 24 14:23:31.921191 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.921094 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvs6j\" (UniqueName: \"kubernetes.io/projected/952d5757-28bc-4940-9fa6-4a50ffff6476-kube-api-access-hvs6j\") pod \"network-metrics-daemon-ct9nz\" (UID: 
\"952d5757-28bc-4940-9fa6-4a50ffff6476\") " pod="openshift-multus/network-metrics-daemon-ct9nz" Apr 24 14:23:31.921191 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.921113 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0bd6039-b2d8-405a-b478-69690078dd73-etc-openvswitch\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s" Apr 24 14:23:31.921191 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.921158 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e0bd6039-b2d8-405a-b478-69690078dd73-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s" Apr 24 14:23:31.921191 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.921181 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e0bd6039-b2d8-405a-b478-69690078dd73-ovnkube-script-lib\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s" Apr 24 14:23:31.921191 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.921195 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/1fba2eb1-536b-4442-bf40-5af241dd98fd-etc-systemd\") pod \"tuned-vl2zx\" (UID: \"1fba2eb1-536b-4442-bf40-5af241dd98fd\") " pod="openshift-cluster-node-tuning-operator/tuned-vl2zx" Apr 24 14:23:31.921755 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.921221 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-multus-daemon-config\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " pod="openshift-multus/multus-h5ptb" Apr 24 14:23:31.921755 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.921242 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-host-run-multus-certs\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " pod="openshift-multus/multus-h5ptb" Apr 24 14:23:31.921755 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.921257 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/952d5757-28bc-4940-9fa6-4a50ffff6476-metrics-certs\") pod \"network-metrics-daemon-ct9nz\" (UID: \"952d5757-28bc-4940-9fa6-4a50ffff6476\") " pod="openshift-multus/network-metrics-daemon-ct9nz" Apr 24 14:23:31.921755 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.921271 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tzl7\" (UniqueName: \"kubernetes.io/projected/e0bd6039-b2d8-405a-b478-69690078dd73-kube-api-access-2tzl7\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s" Apr 24 14:23:31.921755 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.921301 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/1fba2eb1-536b-4442-bf40-5af241dd98fd-etc-modprobe-d\") pod \"tuned-vl2zx\" (UID: \"1fba2eb1-536b-4442-bf40-5af241dd98fd\") " pod="openshift-cluster-node-tuning-operator/tuned-vl2zx" Apr 24 14:23:31.921755 ip-10-0-128-36 
kubenswrapper[2567]: I0424 14:23:31.921318 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r54w4\" (UniqueName: \"kubernetes.io/projected/1fba2eb1-536b-4442-bf40-5af241dd98fd-kube-api-access-r54w4\") pod \"tuned-vl2zx\" (UID: \"1fba2eb1-536b-4442-bf40-5af241dd98fd\") " pod="openshift-cluster-node-tuning-operator/tuned-vl2zx" Apr 24 14:23:31.921755 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.921346 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/04393dd2-c684-4592-bc88-2223fac95a11-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cb58k\" (UID: \"04393dd2-c684-4592-bc88-2223fac95a11\") " pod="openshift-multus/multus-additional-cni-plugins-cb58k" Apr 24 14:23:31.921755 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.921381 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e0bd6039-b2d8-405a-b478-69690078dd73-run-systemd\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s" Apr 24 14:23:31.921755 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.921413 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e0bd6039-b2d8-405a-b478-69690078dd73-host-run-ovn-kubernetes\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s" Apr 24 14:23:31.921755 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.921438 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a071a258-93d2-4209-8cc4-3e5105208e68-kubelet-dir\") 
pod \"aws-ebs-csi-driver-node-7npfb\" (UID: \"a071a258-93d2-4209-8cc4-3e5105208e68\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7npfb" Apr 24 14:23:31.921755 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.921478 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-host-run-netns\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " pod="openshift-multus/multus-h5ptb" Apr 24 14:23:31.921755 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.921502 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-host-var-lib-cni-bin\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " pod="openshift-multus/multus-h5ptb" Apr 24 14:23:31.921755 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.921519 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e0bd6039-b2d8-405a-b478-69690078dd73-host-kubelet\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s" Apr 24 14:23:31.921755 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.921544 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e0bd6039-b2d8-405a-b478-69690078dd73-ovn-node-metrics-cert\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s" Apr 24 14:23:31.921755 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.921583 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/1fba2eb1-536b-4442-bf40-5af241dd98fd-etc-tuned\") pod \"tuned-vl2zx\" (UID: \"1fba2eb1-536b-4442-bf40-5af241dd98fd\") " pod="openshift-cluster-node-tuning-operator/tuned-vl2zx" Apr 24 14:23:31.921755 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.921602 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a071a258-93d2-4209-8cc4-3e5105208e68-socket-dir\") pod \"aws-ebs-csi-driver-node-7npfb\" (UID: \"a071a258-93d2-4209-8cc4-3e5105208e68\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7npfb" Apr 24 14:23:31.922476 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.921618 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-os-release\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " pod="openshift-multus/multus-h5ptb" Apr 24 14:23:31.922476 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.921631 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-multus-socket-dir-parent\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " pod="openshift-multus/multus-h5ptb" Apr 24 14:23:31.922476 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.921644 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-etc-kubernetes\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " pod="openshift-multus/multus-h5ptb" Apr 24 14:23:31.922476 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.921665 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0bd6039-b2d8-405a-b478-69690078dd73-var-lib-openvswitch\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s" Apr 24 14:23:31.922476 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.921727 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a071a258-93d2-4209-8cc4-3e5105208e68-device-dir\") pod \"aws-ebs-csi-driver-node-7npfb\" (UID: \"a071a258-93d2-4209-8cc4-3e5105208e68\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7npfb" Apr 24 14:23:31.922476 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.921755 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-host-run-k8s-cni-cncf-io\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " pod="openshift-multus/multus-h5ptb" Apr 24 14:23:31.922476 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.921779 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1fba2eb1-536b-4442-bf40-5af241dd98fd-tmp\") pod \"tuned-vl2zx\" (UID: \"1fba2eb1-536b-4442-bf40-5af241dd98fd\") " pod="openshift-cluster-node-tuning-operator/tuned-vl2zx" Apr 24 14:23:31.922476 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.921836 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1a01a64d-13d5-4b58-b51d-c8bda8dbefb6-agent-certs\") pod \"konnectivity-agent-vrgvl\" (UID: \"1a01a64d-13d5-4b58-b51d-c8bda8dbefb6\") " pod="kube-system/konnectivity-agent-vrgvl" Apr 24 
14:23:31.922476 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.921861 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-cnibin\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " pod="openshift-multus/multus-h5ptb" Apr 24 14:23:31.922476 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.921875 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-host-var-lib-cni-multus\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " pod="openshift-multus/multus-h5ptb" Apr 24 14:23:31.922476 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.921892 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5fc12c1f-0a1b-49b5-b20f-0208c496ba66-host-slash\") pod \"iptables-alerter-m2lt2\" (UID: \"5fc12c1f-0a1b-49b5-b20f-0208c496ba66\") " pod="openshift-network-operator/iptables-alerter-m2lt2" Apr 24 14:23:31.922476 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.921918 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1fba2eb1-536b-4442-bf40-5af241dd98fd-var-lib-kubelet\") pod \"tuned-vl2zx\" (UID: \"1fba2eb1-536b-4442-bf40-5af241dd98fd\") " pod="openshift-cluster-node-tuning-operator/tuned-vl2zx" Apr 24 14:23:31.922476 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.921935 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1a01a64d-13d5-4b58-b51d-c8bda8dbefb6-konnectivity-ca\") pod \"konnectivity-agent-vrgvl\" (UID: 
\"1a01a64d-13d5-4b58-b51d-c8bda8dbefb6\") " pod="kube-system/konnectivity-agent-vrgvl" Apr 24 14:23:31.922476 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.921961 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/04393dd2-c684-4592-bc88-2223fac95a11-system-cni-dir\") pod \"multus-additional-cni-plugins-cb58k\" (UID: \"04393dd2-c684-4592-bc88-2223fac95a11\") " pod="openshift-multus/multus-additional-cni-plugins-cb58k" Apr 24 14:23:31.922476 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.921983 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/1fba2eb1-536b-4442-bf40-5af241dd98fd-etc-sysctl-d\") pod \"tuned-vl2zx\" (UID: \"1fba2eb1-536b-4442-bf40-5af241dd98fd\") " pod="openshift-cluster-node-tuning-operator/tuned-vl2zx" Apr 24 14:23:31.922476 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.922013 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mh88\" (UniqueName: \"kubernetes.io/projected/a071a258-93d2-4209-8cc4-3e5105208e68-kube-api-access-4mh88\") pod \"aws-ebs-csi-driver-node-7npfb\" (UID: \"a071a258-93d2-4209-8cc4-3e5105208e68\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7npfb" Apr 24 14:23:31.922476 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.922035 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-system-cni-dir\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " pod="openshift-multus/multus-h5ptb" Apr 24 14:23:31.923230 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.922056 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-multus-conf-dir\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " pod="openshift-multus/multus-h5ptb" Apr 24 14:23:31.923230 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.922092 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e0bd6039-b2d8-405a-b478-69690078dd73-env-overrides\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s" Apr 24 14:23:31.923230 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.922122 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1fba2eb1-536b-4442-bf40-5af241dd98fd-sys\") pod \"tuned-vl2zx\" (UID: \"1fba2eb1-536b-4442-bf40-5af241dd98fd\") " pod="openshift-cluster-node-tuning-operator/tuned-vl2zx" Apr 24 14:23:31.923230 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.922141 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1fba2eb1-536b-4442-bf40-5af241dd98fd-lib-modules\") pod \"tuned-vl2zx\" (UID: \"1fba2eb1-536b-4442-bf40-5af241dd98fd\") " pod="openshift-cluster-node-tuning-operator/tuned-vl2zx" Apr 24 14:23:31.923230 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.922161 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2a44ddf6-7291-4385-a9e5-9e6dd777407e-host\") pod \"node-ca-m4w28\" (UID: \"2a44ddf6-7291-4385-a9e5-9e6dd777407e\") " pod="openshift-image-registry/node-ca-m4w28" Apr 24 14:23:31.923230 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.922195 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-cni-binary-copy\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " pod="openshift-multus/multus-h5ptb" Apr 24 14:23:31.923230 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.922256 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bhnt\" (UniqueName: \"kubernetes.io/projected/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-kube-api-access-4bhnt\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " pod="openshift-multus/multus-h5ptb" Apr 24 14:23:31.947270 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.947241 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 14:18:30 +0000 UTC" deadline="2028-01-11 23:15:53.43562199 +0000 UTC" Apr 24 14:23:31.947270 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:31.947270 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15056h52m21.488355671s" Apr 24 14:23:32.022540 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.022514 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a071a258-93d2-4209-8cc4-3e5105208e68-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7npfb\" (UID: \"a071a258-93d2-4209-8cc4-3e5105208e68\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7npfb" Apr 24 14:23:32.022540 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.022543 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-host-run-netns\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " 
pod="openshift-multus/multus-h5ptb" Apr 24 14:23:32.022756 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.022558 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-host-var-lib-cni-bin\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " pod="openshift-multus/multus-h5ptb" Apr 24 14:23:32.022756 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.022580 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e0bd6039-b2d8-405a-b478-69690078dd73-host-kubelet\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s" Apr 24 14:23:32.022756 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.022604 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e0bd6039-b2d8-405a-b478-69690078dd73-ovn-node-metrics-cert\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s" Apr 24 14:23:32.022756 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.022627 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/1fba2eb1-536b-4442-bf40-5af241dd98fd-etc-tuned\") pod \"tuned-vl2zx\" (UID: \"1fba2eb1-536b-4442-bf40-5af241dd98fd\") " pod="openshift-cluster-node-tuning-operator/tuned-vl2zx" Apr 24 14:23:32.022756 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.022639 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-host-run-netns\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " 
pod="openshift-multus/multus-h5ptb" Apr 24 14:23:32.022756 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.022655 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-host-var-lib-cni-bin\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " pod="openshift-multus/multus-h5ptb" Apr 24 14:23:32.022756 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.022652 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a071a258-93d2-4209-8cc4-3e5105208e68-socket-dir\") pod \"aws-ebs-csi-driver-node-7npfb\" (UID: \"a071a258-93d2-4209-8cc4-3e5105208e68\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7npfb" Apr 24 14:23:32.022756 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.022639 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a071a258-93d2-4209-8cc4-3e5105208e68-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7npfb\" (UID: \"a071a258-93d2-4209-8cc4-3e5105208e68\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7npfb" Apr 24 14:23:32.022756 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.022691 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-os-release\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " pod="openshift-multus/multus-h5ptb" Apr 24 14:23:32.022756 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.022755 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-os-release\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " pod="openshift-multus/multus-h5ptb" 
Apr 24 14:23:32.023224 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.022765 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e0bd6039-b2d8-405a-b478-69690078dd73-host-kubelet\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s" Apr 24 14:23:32.023224 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.022782 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-multus-socket-dir-parent\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " pod="openshift-multus/multus-h5ptb" Apr 24 14:23:32.023224 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.022844 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-etc-kubernetes\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " pod="openshift-multus/multus-h5ptb" Apr 24 14:23:32.023224 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.022866 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0bd6039-b2d8-405a-b478-69690078dd73-var-lib-openvswitch\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s" Apr 24 14:23:32.023224 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.022911 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-etc-kubernetes\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " pod="openshift-multus/multus-h5ptb" Apr 24 14:23:32.023224 ip-10-0-128-36 
kubenswrapper[2567]: I0424 14:23:32.022941 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a071a258-93d2-4209-8cc4-3e5105208e68-device-dir\") pod \"aws-ebs-csi-driver-node-7npfb\" (UID: \"a071a258-93d2-4209-8cc4-3e5105208e68\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7npfb" Apr 24 14:23:32.023224 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.022966 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0bd6039-b2d8-405a-b478-69690078dd73-var-lib-openvswitch\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s" Apr 24 14:23:32.023224 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.022975 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a071a258-93d2-4209-8cc4-3e5105208e68-socket-dir\") pod \"aws-ebs-csi-driver-node-7npfb\" (UID: \"a071a258-93d2-4209-8cc4-3e5105208e68\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7npfb" Apr 24 14:23:32.023224 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.023030 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-multus-socket-dir-parent\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " pod="openshift-multus/multus-h5ptb" Apr 24 14:23:32.023224 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.023053 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a071a258-93d2-4209-8cc4-3e5105208e68-device-dir\") pod \"aws-ebs-csi-driver-node-7npfb\" (UID: \"a071a258-93d2-4209-8cc4-3e5105208e68\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7npfb" Apr 24 14:23:32.023224 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.023073 2567 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 14:23:32.023224 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.023108 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-host-run-k8s-cni-cncf-io\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " pod="openshift-multus/multus-h5ptb" Apr 24 14:23:32.023744 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.023286 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-host-run-k8s-cni-cncf-io\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " pod="openshift-multus/multus-h5ptb" Apr 24 14:23:32.023744 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.023325 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1fba2eb1-536b-4442-bf40-5af241dd98fd-tmp\") pod \"tuned-vl2zx\" (UID: \"1fba2eb1-536b-4442-bf40-5af241dd98fd\") " pod="openshift-cluster-node-tuning-operator/tuned-vl2zx" Apr 24 14:23:32.023744 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.023350 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1a01a64d-13d5-4b58-b51d-c8bda8dbefb6-agent-certs\") pod \"konnectivity-agent-vrgvl\" (UID: \"1a01a64d-13d5-4b58-b51d-c8bda8dbefb6\") " pod="kube-system/konnectivity-agent-vrgvl" Apr 24 14:23:32.023744 ip-10-0-128-36 kubenswrapper[2567]: I0424 
14:23:32.023372 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-cnibin\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " pod="openshift-multus/multus-h5ptb" Apr 24 14:23:32.023744 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.023416 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-host-var-lib-cni-multus\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " pod="openshift-multus/multus-h5ptb" Apr 24 14:23:32.023744 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.023441 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5fc12c1f-0a1b-49b5-b20f-0208c496ba66-host-slash\") pod \"iptables-alerter-m2lt2\" (UID: \"5fc12c1f-0a1b-49b5-b20f-0208c496ba66\") " pod="openshift-network-operator/iptables-alerter-m2lt2" Apr 24 14:23:32.023744 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.023466 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1fba2eb1-536b-4442-bf40-5af241dd98fd-var-lib-kubelet\") pod \"tuned-vl2zx\" (UID: \"1fba2eb1-536b-4442-bf40-5af241dd98fd\") " pod="openshift-cluster-node-tuning-operator/tuned-vl2zx" Apr 24 14:23:32.023744 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.023465 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-cnibin\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " pod="openshift-multus/multus-h5ptb" Apr 24 14:23:32.023744 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.023488 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1a01a64d-13d5-4b58-b51d-c8bda8dbefb6-konnectivity-ca\") pod \"konnectivity-agent-vrgvl\" (UID: \"1a01a64d-13d5-4b58-b51d-c8bda8dbefb6\") " pod="kube-system/konnectivity-agent-vrgvl" Apr 24 14:23:32.023744 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.023519 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5fc12c1f-0a1b-49b5-b20f-0208c496ba66-host-slash\") pod \"iptables-alerter-m2lt2\" (UID: \"5fc12c1f-0a1b-49b5-b20f-0208c496ba66\") " pod="openshift-network-operator/iptables-alerter-m2lt2" Apr 24 14:23:32.023744 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.023566 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-host-var-lib-cni-multus\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " pod="openshift-multus/multus-h5ptb" Apr 24 14:23:32.023744 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.023557 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1fba2eb1-536b-4442-bf40-5af241dd98fd-var-lib-kubelet\") pod \"tuned-vl2zx\" (UID: \"1fba2eb1-536b-4442-bf40-5af241dd98fd\") " pod="openshift-cluster-node-tuning-operator/tuned-vl2zx" Apr 24 14:23:32.023744 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.023606 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/04393dd2-c684-4592-bc88-2223fac95a11-system-cni-dir\") pod \"multus-additional-cni-plugins-cb58k\" (UID: \"04393dd2-c684-4592-bc88-2223fac95a11\") " pod="openshift-multus/multus-additional-cni-plugins-cb58k" Apr 24 14:23:32.023744 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.023635 2567 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/1fba2eb1-536b-4442-bf40-5af241dd98fd-etc-sysctl-d\") pod \"tuned-vl2zx\" (UID: \"1fba2eb1-536b-4442-bf40-5af241dd98fd\") " pod="openshift-cluster-node-tuning-operator/tuned-vl2zx"
Apr 24 14:23:32.023744 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.023662 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4mh88\" (UniqueName: \"kubernetes.io/projected/a071a258-93d2-4209-8cc4-3e5105208e68-kube-api-access-4mh88\") pod \"aws-ebs-csi-driver-node-7npfb\" (UID: \"a071a258-93d2-4209-8cc4-3e5105208e68\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7npfb"
Apr 24 14:23:32.023744 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.023681 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/04393dd2-c684-4592-bc88-2223fac95a11-system-cni-dir\") pod \"multus-additional-cni-plugins-cb58k\" (UID: \"04393dd2-c684-4592-bc88-2223fac95a11\") " pod="openshift-multus/multus-additional-cni-plugins-cb58k"
Apr 24 14:23:32.023744 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.023685 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-system-cni-dir\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " pod="openshift-multus/multus-h5ptb"
Apr 24 14:23:32.023744 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.023724 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-system-cni-dir\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " pod="openshift-multus/multus-h5ptb"
Apr 24 14:23:32.024536 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.023732 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-multus-conf-dir\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " pod="openshift-multus/multus-h5ptb"
Apr 24 14:23:32.024536 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.023755 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e0bd6039-b2d8-405a-b478-69690078dd73-env-overrides\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s"
Apr 24 14:23:32.024536 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.023778 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1fba2eb1-536b-4442-bf40-5af241dd98fd-sys\") pod \"tuned-vl2zx\" (UID: \"1fba2eb1-536b-4442-bf40-5af241dd98fd\") " pod="openshift-cluster-node-tuning-operator/tuned-vl2zx"
Apr 24 14:23:32.024536 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.023799 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1fba2eb1-536b-4442-bf40-5af241dd98fd-lib-modules\") pod \"tuned-vl2zx\" (UID: \"1fba2eb1-536b-4442-bf40-5af241dd98fd\") " pod="openshift-cluster-node-tuning-operator/tuned-vl2zx"
Apr 24 14:23:32.024536 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.023839 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2a44ddf6-7291-4385-a9e5-9e6dd777407e-host\") pod \"node-ca-m4w28\" (UID: \"2a44ddf6-7291-4385-a9e5-9e6dd777407e\") " pod="openshift-image-registry/node-ca-m4w28"
Apr 24 14:23:32.024536 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.023861 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-cni-binary-copy\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " pod="openshift-multus/multus-h5ptb"
Apr 24 14:23:32.024536 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.023867 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/1fba2eb1-536b-4442-bf40-5af241dd98fd-etc-sysctl-d\") pod \"tuned-vl2zx\" (UID: \"1fba2eb1-536b-4442-bf40-5af241dd98fd\") " pod="openshift-cluster-node-tuning-operator/tuned-vl2zx"
Apr 24 14:23:32.024536 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.023885 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4bhnt\" (UniqueName: \"kubernetes.io/projected/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-kube-api-access-4bhnt\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " pod="openshift-multus/multus-h5ptb"
Apr 24 14:23:32.024536 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.023909 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e0bd6039-b2d8-405a-b478-69690078dd73-log-socket\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s"
Apr 24 14:23:32.024536 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.023915 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1fba2eb1-536b-4442-bf40-5af241dd98fd-sys\") pod \"tuned-vl2zx\" (UID: \"1fba2eb1-536b-4442-bf40-5af241dd98fd\") " pod="openshift-cluster-node-tuning-operator/tuned-vl2zx"
Apr 24 14:23:32.024536 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.023931 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e0bd6039-b2d8-405a-b478-69690078dd73-ovnkube-config\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s"
Apr 24 14:23:32.024536 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.023954 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-multus-conf-dir\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " pod="openshift-multus/multus-h5ptb"
Apr 24 14:23:32.024536 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.023969 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b9ngj\" (UniqueName: \"kubernetes.io/projected/5fc12c1f-0a1b-49b5-b20f-0208c496ba66-kube-api-access-b9ngj\") pod \"iptables-alerter-m2lt2\" (UID: \"5fc12c1f-0a1b-49b5-b20f-0208c496ba66\") " pod="openshift-network-operator/iptables-alerter-m2lt2"
Apr 24 14:23:32.024536 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.023996 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a071a258-93d2-4209-8cc4-3e5105208e68-registration-dir\") pod \"aws-ebs-csi-driver-node-7npfb\" (UID: \"a071a258-93d2-4209-8cc4-3e5105208e68\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7npfb"
Apr 24 14:23:32.024536 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.024020 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dvh6j\" (UniqueName: \"kubernetes.io/projected/2a44ddf6-7291-4385-a9e5-9e6dd777407e-kube-api-access-dvh6j\") pod \"node-ca-m4w28\" (UID: \"2a44ddf6-7291-4385-a9e5-9e6dd777407e\") " pod="openshift-image-registry/node-ca-m4w28"
Apr 24 14:23:32.024536 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.024043 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e0bd6039-b2d8-405a-b478-69690078dd73-host-cni-netd\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s"
Apr 24 14:23:32.024536 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.024067 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/5fc12c1f-0a1b-49b5-b20f-0208c496ba66-iptables-alerter-script\") pod \"iptables-alerter-m2lt2\" (UID: \"5fc12c1f-0a1b-49b5-b20f-0208c496ba66\") " pod="openshift-network-operator/iptables-alerter-m2lt2"
Apr 24 14:23:32.024536 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.024072 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1a01a64d-13d5-4b58-b51d-c8bda8dbefb6-konnectivity-ca\") pod \"konnectivity-agent-vrgvl\" (UID: \"1a01a64d-13d5-4b58-b51d-c8bda8dbefb6\") " pod="kube-system/konnectivity-agent-vrgvl"
Apr 24 14:23:32.025350 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.024092 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/1fba2eb1-536b-4442-bf40-5af241dd98fd-etc-sysconfig\") pod \"tuned-vl2zx\" (UID: \"1fba2eb1-536b-4442-bf40-5af241dd98fd\") " pod="openshift-cluster-node-tuning-operator/tuned-vl2zx"
Apr 24 14:23:32.025350 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.024135 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1fba2eb1-536b-4442-bf40-5af241dd98fd-etc-kubernetes\") pod \"tuned-vl2zx\" (UID: \"1fba2eb1-536b-4442-bf40-5af241dd98fd\") " pod="openshift-cluster-node-tuning-operator/tuned-vl2zx"
Apr 24 14:23:32.025350 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.024139 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/1fba2eb1-536b-4442-bf40-5af241dd98fd-etc-sysconfig\") pod \"tuned-vl2zx\" (UID: \"1fba2eb1-536b-4442-bf40-5af241dd98fd\") " pod="openshift-cluster-node-tuning-operator/tuned-vl2zx"
Apr 24 14:23:32.025350 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.024160 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/1fba2eb1-536b-4442-bf40-5af241dd98fd-etc-sysctl-conf\") pod \"tuned-vl2zx\" (UID: \"1fba2eb1-536b-4442-bf40-5af241dd98fd\") " pod="openshift-cluster-node-tuning-operator/tuned-vl2zx"
Apr 24 14:23:32.025350 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.024185 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1fba2eb1-536b-4442-bf40-5af241dd98fd-run\") pod \"tuned-vl2zx\" (UID: \"1fba2eb1-536b-4442-bf40-5af241dd98fd\") " pod="openshift-cluster-node-tuning-operator/tuned-vl2zx"
Apr 24 14:23:32.025350 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.024213 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/04393dd2-c684-4592-bc88-2223fac95a11-os-release\") pod \"multus-additional-cni-plugins-cb58k\" (UID: \"04393dd2-c684-4592-bc88-2223fac95a11\") " pod="openshift-multus/multus-additional-cni-plugins-cb58k"
Apr 24 14:23:32.025350 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.024240 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/04393dd2-c684-4592-bc88-2223fac95a11-cni-binary-copy\") pod \"multus-additional-cni-plugins-cb58k\" (UID: \"04393dd2-c684-4592-bc88-2223fac95a11\") " pod="openshift-multus/multus-additional-cni-plugins-cb58k"
Apr 24 14:23:32.025350 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.024239 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1fba2eb1-536b-4442-bf40-5af241dd98fd-lib-modules\") pod \"tuned-vl2zx\" (UID: \"1fba2eb1-536b-4442-bf40-5af241dd98fd\") " pod="openshift-cluster-node-tuning-operator/tuned-vl2zx"
Apr 24 14:23:32.025350 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.024284 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tzql6\" (UniqueName: \"kubernetes.io/projected/f33ebad9-63f4-4a25-865f-68c02ee70c85-kube-api-access-tzql6\") pod \"network-check-target-57rkt\" (UID: \"f33ebad9-63f4-4a25-865f-68c02ee70c85\") " pod="openshift-network-diagnostics/network-check-target-57rkt"
Apr 24 14:23:32.025350 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.024319 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e0bd6039-b2d8-405a-b478-69690078dd73-host-cni-bin\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s"
Apr 24 14:23:32.025350 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.024343 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1fba2eb1-536b-4442-bf40-5af241dd98fd-host\") pod \"tuned-vl2zx\" (UID: \"1fba2eb1-536b-4442-bf40-5af241dd98fd\") " pod="openshift-cluster-node-tuning-operator/tuned-vl2zx"
Apr 24 14:23:32.025350 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.024367 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a071a258-93d2-4209-8cc4-3e5105208e68-etc-selinux\") pod \"aws-ebs-csi-driver-node-7npfb\" (UID: \"a071a258-93d2-4209-8cc4-3e5105208e68\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7npfb"
Apr 24 14:23:32.025350 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.024393 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-multus-cni-dir\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " pod="openshift-multus/multus-h5ptb"
Apr 24 14:23:32.025350 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.024420 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-host-var-lib-kubelet\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " pod="openshift-multus/multus-h5ptb"
Apr 24 14:23:32.025350 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.024480 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dv6zc\" (UniqueName: \"kubernetes.io/projected/04393dd2-c684-4592-bc88-2223fac95a11-kube-api-access-dv6zc\") pod \"multus-additional-cni-plugins-cb58k\" (UID: \"04393dd2-c684-4592-bc88-2223fac95a11\") " pod="openshift-multus/multus-additional-cni-plugins-cb58k"
Apr 24 14:23:32.025350 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.024508 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-hostroot\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " pod="openshift-multus/multus-h5ptb"
Apr 24 14:23:32.025350 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.024532 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e0bd6039-b2d8-405a-b478-69690078dd73-run-ovn\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s"
Apr 24 14:23:32.026140 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.024557 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a071a258-93d2-4209-8cc4-3e5105208e68-sys-fs\") pod \"aws-ebs-csi-driver-node-7npfb\" (UID: \"a071a258-93d2-4209-8cc4-3e5105208e68\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7npfb"
Apr 24 14:23:32.026140 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.024581 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2a44ddf6-7291-4385-a9e5-9e6dd777407e-serviceca\") pod \"node-ca-m4w28\" (UID: \"2a44ddf6-7291-4385-a9e5-9e6dd777407e\") " pod="openshift-image-registry/node-ca-m4w28"
Apr 24 14:23:32.026140 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.024590 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1fba2eb1-536b-4442-bf40-5af241dd98fd-etc-kubernetes\") pod \"tuned-vl2zx\" (UID: \"1fba2eb1-536b-4442-bf40-5af241dd98fd\") " pod="openshift-cluster-node-tuning-operator/tuned-vl2zx"
Apr 24 14:23:32.026140 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.024604 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/04393dd2-c684-4592-bc88-2223fac95a11-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cb58k\" (UID: \"04393dd2-c684-4592-bc88-2223fac95a11\") " pod="openshift-multus/multus-additional-cni-plugins-cb58k"
Apr 24 14:23:32.026140 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.024630 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/04393dd2-c684-4592-bc88-2223fac95a11-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-cb58k\" (UID: \"04393dd2-c684-4592-bc88-2223fac95a11\") " pod="openshift-multus/multus-additional-cni-plugins-cb58k"
Apr 24 14:23:32.026140 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.024652 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e0bd6039-b2d8-405a-b478-69690078dd73-systemd-units\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s"
Apr 24 14:23:32.026140 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.024675 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e0bd6039-b2d8-405a-b478-69690078dd73-host-slash\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s"
Apr 24 14:23:32.026140 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.024695 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e0bd6039-b2d8-405a-b478-69690078dd73-host-run-netns\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s"
Apr 24 14:23:32.026140 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.024719 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/1fba2eb1-536b-4442-bf40-5af241dd98fd-etc-sysctl-conf\") pod \"tuned-vl2zx\" (UID: \"1fba2eb1-536b-4442-bf40-5af241dd98fd\") " pod="openshift-cluster-node-tuning-operator/tuned-vl2zx"
Apr 24 14:23:32.026140 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.024721 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0bd6039-b2d8-405a-b478-69690078dd73-run-openvswitch\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s"
Apr 24 14:23:32.026140 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.024716 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e0bd6039-b2d8-405a-b478-69690078dd73-ovnkube-config\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s"
Apr 24 14:23:32.026140 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.024757 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0bd6039-b2d8-405a-b478-69690078dd73-run-openvswitch\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s"
Apr 24 14:23:32.026140 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.024757 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e0bd6039-b2d8-405a-b478-69690078dd73-node-log\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s"
Apr 24 14:23:32.026140 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.024793 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/04393dd2-c684-4592-bc88-2223fac95a11-cnibin\") pod \"multus-additional-cni-plugins-cb58k\" (UID: \"04393dd2-c684-4592-bc88-2223fac95a11\") " pod="openshift-multus/multus-additional-cni-plugins-cb58k"
Apr 24 14:23:32.026140 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.024800 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1fba2eb1-536b-4442-bf40-5af241dd98fd-run\") pod \"tuned-vl2zx\" (UID: \"1fba2eb1-536b-4442-bf40-5af241dd98fd\") " pod="openshift-cluster-node-tuning-operator/tuned-vl2zx"
Apr 24 14:23:32.026140 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.024835 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hvs6j\" (UniqueName: \"kubernetes.io/projected/952d5757-28bc-4940-9fa6-4a50ffff6476-kube-api-access-hvs6j\") pod \"network-metrics-daemon-ct9nz\" (UID: \"952d5757-28bc-4940-9fa6-4a50ffff6476\") " pod="openshift-multus/network-metrics-daemon-ct9nz"
Apr 24 14:23:32.026140 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.024841 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a071a258-93d2-4209-8cc4-3e5105208e68-registration-dir\") pod \"aws-ebs-csi-driver-node-7npfb\" (UID: \"a071a258-93d2-4209-8cc4-3e5105208e68\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7npfb"
Apr 24 14:23:32.026966 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.024867 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0bd6039-b2d8-405a-b478-69690078dd73-etc-openvswitch\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s"
Apr 24 14:23:32.026966 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.024895 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e0bd6039-b2d8-405a-b478-69690078dd73-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s"
Apr 24 14:23:32.026966 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.024912 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e0bd6039-b2d8-405a-b478-69690078dd73-host-cni-netd\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s"
Apr 24 14:23:32.026966 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.024918 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/04393dd2-c684-4592-bc88-2223fac95a11-os-release\") pod \"multus-additional-cni-plugins-cb58k\" (UID: \"04393dd2-c684-4592-bc88-2223fac95a11\") " pod="openshift-multus/multus-additional-cni-plugins-cb58k"
Apr 24 14:23:32.026966 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.024922 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e0bd6039-b2d8-405a-b478-69690078dd73-ovnkube-script-lib\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s"
Apr 24 14:23:32.026966 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.024946 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/1fba2eb1-536b-4442-bf40-5af241dd98fd-etc-systemd\") pod \"tuned-vl2zx\" (UID: \"1fba2eb1-536b-4442-bf40-5af241dd98fd\") " pod="openshift-cluster-node-tuning-operator/tuned-vl2zx"
Apr 24 14:23:32.026966 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.024974 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-multus-daemon-config\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " pod="openshift-multus/multus-h5ptb"
Apr 24 14:23:32.026966 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.024999 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-host-run-multus-certs\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " pod="openshift-multus/multus-h5ptb"
Apr 24 14:23:32.026966 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.025024 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/952d5757-28bc-4940-9fa6-4a50ffff6476-metrics-certs\") pod \"network-metrics-daemon-ct9nz\" (UID: \"952d5757-28bc-4940-9fa6-4a50ffff6476\") " pod="openshift-multus/network-metrics-daemon-ct9nz"
Apr 24 14:23:32.026966 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.025049 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2tzl7\" (UniqueName: \"kubernetes.io/projected/e0bd6039-b2d8-405a-b478-69690078dd73-kube-api-access-2tzl7\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s"
Apr 24 14:23:32.026966 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.025075 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/1fba2eb1-536b-4442-bf40-5af241dd98fd-etc-modprobe-d\") pod \"tuned-vl2zx\" (UID: \"1fba2eb1-536b-4442-bf40-5af241dd98fd\") " pod="openshift-cluster-node-tuning-operator/tuned-vl2zx"
Apr 24 14:23:32.026966 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.025099 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r54w4\" (UniqueName: \"kubernetes.io/projected/1fba2eb1-536b-4442-bf40-5af241dd98fd-kube-api-access-r54w4\") pod \"tuned-vl2zx\" (UID: \"1fba2eb1-536b-4442-bf40-5af241dd98fd\") " pod="openshift-cluster-node-tuning-operator/tuned-vl2zx"
Apr 24 14:23:32.026966 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.025144 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/04393dd2-c684-4592-bc88-2223fac95a11-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cb58k\" (UID: \"04393dd2-c684-4592-bc88-2223fac95a11\") " pod="openshift-multus/multus-additional-cni-plugins-cb58k"
Apr 24 14:23:32.026966 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.025190 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e0bd6039-b2d8-405a-b478-69690078dd73-run-systemd\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s"
Apr 24 14:23:32.026966 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.025229 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e0bd6039-b2d8-405a-b478-69690078dd73-host-run-ovn-kubernetes\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s"
Apr 24 14:23:32.026966 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.025305 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e0bd6039-b2d8-405a-b478-69690078dd73-host-slash\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s"
Apr 24 14:23:32.026966 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.025343 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/04393dd2-c684-4592-bc88-2223fac95a11-cnibin\") pod \"multus-additional-cni-plugins-cb58k\" (UID: \"04393dd2-c684-4592-bc88-2223fac95a11\") " pod="openshift-multus/multus-additional-cni-plugins-cb58k"
Apr 24 14:23:32.027631 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.025381 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2a44ddf6-7291-4385-a9e5-9e6dd777407e-host\") pod \"node-ca-m4w28\" (UID: \"2a44ddf6-7291-4385-a9e5-9e6dd777407e\") " pod="openshift-image-registry/node-ca-m4w28"
Apr 24 14:23:32.027631 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.025471 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/5fc12c1f-0a1b-49b5-b20f-0208c496ba66-iptables-alerter-script\") pod \"iptables-alerter-m2lt2\" (UID: \"5fc12c1f-0a1b-49b5-b20f-0208c496ba66\") " pod="openshift-network-operator/iptables-alerter-m2lt2"
Apr 24 14:23:32.027631 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.025518 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0bd6039-b2d8-405a-b478-69690078dd73-etc-openvswitch\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s"
Apr 24 14:23:32.027631 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.024868 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e0bd6039-b2d8-405a-b478-69690078dd73-node-log\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s"
Apr 24 14:23:32.027631 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.025573 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e0bd6039-b2d8-405a-b478-69690078dd73-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s"
Apr 24 14:23:32.027631 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.025917 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-cni-binary-copy\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " pod="openshift-multus/multus-h5ptb"
Apr 24 14:23:32.027631 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.025968 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e0bd6039-b2d8-405a-b478-69690078dd73-log-socket\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s"
Apr 24 14:23:32.027631 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.026170 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e0bd6039-b2d8-405a-b478-69690078dd73-env-overrides\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s"
Apr 24 14:23:32.027631 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.026305 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-host-run-multus-certs\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " pod="openshift-multus/multus-h5ptb"
Apr 24 14:23:32.027631 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:32.026435 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 14:23:32.027631 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:32.026520 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/952d5757-28bc-4940-9fa6-4a50ffff6476-metrics-certs podName:952d5757-28bc-4940-9fa6-4a50ffff6476 nodeName:}" failed. No retries permitted until 2026-04-24 14:23:32.526502036 +0000 UTC m=+3.116654754 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/952d5757-28bc-4940-9fa6-4a50ffff6476-metrics-certs") pod "network-metrics-daemon-ct9nz" (UID: "952d5757-28bc-4940-9fa6-4a50ffff6476") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 14:23:32.027631 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.026579 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/1fba2eb1-536b-4442-bf40-5af241dd98fd-etc-tuned\") pod \"tuned-vl2zx\" (UID: \"1fba2eb1-536b-4442-bf40-5af241dd98fd\") " pod="openshift-cluster-node-tuning-operator/tuned-vl2zx"
Apr 24 14:23:32.027631 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.026635 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-multus-daemon-config\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " pod="openshift-multus/multus-h5ptb"
Apr 24 14:23:32.027631 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.026661 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/1fba2eb1-536b-4442-bf40-5af241dd98fd-etc-systemd\") pod \"tuned-vl2zx\" (UID: \"1fba2eb1-536b-4442-bf40-5af241dd98fd\") " pod="openshift-cluster-node-tuning-operator/tuned-vl2zx"
Apr 24 14:23:32.027631 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.026683 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e0bd6039-b2d8-405a-b478-69690078dd73-ovnkube-script-lib\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s"
Apr 24 14:23:32.027631 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.026704 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e0bd6039-b2d8-405a-b478-69690078dd73-run-ovn\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s"
Apr 24 14:23:32.027631 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.026782 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-multus-cni-dir\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " pod="openshift-multus/multus-h5ptb"
Apr 24 14:23:32.027631 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.026833 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/1fba2eb1-536b-4442-bf40-5af241dd98fd-etc-modprobe-d\") pod \"tuned-vl2zx\" (UID: \"1fba2eb1-536b-4442-bf40-5af241dd98fd\") " pod="openshift-cluster-node-tuning-operator/tuned-vl2zx"
Apr 24 14:23:32.028450 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.026849 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-host-var-lib-kubelet\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " pod="openshift-multus/multus-h5ptb"
Apr 24 14:23:32.028450 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.026909 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a071a258-93d2-4209-8cc4-3e5105208e68-etc-selinux\") pod \"aws-ebs-csi-driver-node-7npfb\" (UID: \"a071a258-93d2-4209-8cc4-3e5105208e68\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7npfb"
Apr 24 14:23:32.028450 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.026926 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e0bd6039-b2d8-405a-b478-69690078dd73-run-systemd\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s"
Apr 24 14:23:32.028450 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.026956 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1fba2eb1-536b-4442-bf40-5af241dd98fd-host\") pod \"tuned-vl2zx\" (UID: \"1fba2eb1-536b-4442-bf40-5af241dd98fd\") " pod="openshift-cluster-node-tuning-operator/tuned-vl2zx"
Apr 24 14:23:32.028450 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.026990 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-hostroot\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " pod="openshift-multus/multus-h5ptb"
Apr 24 14:23:32.028450 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.027036 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a071a258-93d2-4209-8cc4-3e5105208e68-sys-fs\") pod \"aws-ebs-csi-driver-node-7npfb\" (UID: \"a071a258-93d2-4209-8cc4-3e5105208e68\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7npfb"
Apr 24 14:23:32.028450 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.027034 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/04393dd2-c684-4592-bc88-2223fac95a11-cni-binary-copy\") pod \"multus-additional-cni-plugins-cb58k\" (UID: \"04393dd2-c684-4592-bc88-2223fac95a11\") " pod="openshift-multus/multus-additional-cni-plugins-cb58k"
Apr 24 14:23:32.028450 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.027058 2567
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e0bd6039-b2d8-405a-b478-69690078dd73-host-run-ovn-kubernetes\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s" Apr 24 14:23:32.028450 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.027106 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e0bd6039-b2d8-405a-b478-69690078dd73-systemd-units\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s" Apr 24 14:23:32.028450 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.025306 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e0bd6039-b2d8-405a-b478-69690078dd73-host-run-netns\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s" Apr 24 14:23:32.028450 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.027154 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e0bd6039-b2d8-405a-b478-69690078dd73-host-cni-bin\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s" Apr 24 14:23:32.028450 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.027168 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/04393dd2-c684-4592-bc88-2223fac95a11-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cb58k\" (UID: \"04393dd2-c684-4592-bc88-2223fac95a11\") " pod="openshift-multus/multus-additional-cni-plugins-cb58k" Apr 24 14:23:32.028450 ip-10-0-128-36 kubenswrapper[2567]: I0424 
14:23:32.027243 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1fba2eb1-536b-4442-bf40-5af241dd98fd-tmp\") pod \"tuned-vl2zx\" (UID: \"1fba2eb1-536b-4442-bf40-5af241dd98fd\") " pod="openshift-cluster-node-tuning-operator/tuned-vl2zx" Apr 24 14:23:32.028450 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.027350 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/04393dd2-c684-4592-bc88-2223fac95a11-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-cb58k\" (UID: \"04393dd2-c684-4592-bc88-2223fac95a11\") " pod="openshift-multus/multus-additional-cni-plugins-cb58k" Apr 24 14:23:32.028450 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.027426 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2a44ddf6-7291-4385-a9e5-9e6dd777407e-serviceca\") pod \"node-ca-m4w28\" (UID: \"2a44ddf6-7291-4385-a9e5-9e6dd777407e\") " pod="openshift-image-registry/node-ca-m4w28" Apr 24 14:23:32.028450 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.027455 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1a01a64d-13d5-4b58-b51d-c8bda8dbefb6-agent-certs\") pod \"konnectivity-agent-vrgvl\" (UID: \"1a01a64d-13d5-4b58-b51d-c8bda8dbefb6\") " pod="kube-system/konnectivity-agent-vrgvl" Apr 24 14:23:32.028450 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.027527 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e0bd6039-b2d8-405a-b478-69690078dd73-ovn-node-metrics-cert\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s" Apr 24 14:23:32.029194 ip-10-0-128-36 kubenswrapper[2567]: I0424 
14:23:32.027585 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/04393dd2-c684-4592-bc88-2223fac95a11-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cb58k\" (UID: \"04393dd2-c684-4592-bc88-2223fac95a11\") " pod="openshift-multus/multus-additional-cni-plugins-cb58k" Apr 24 14:23:32.030664 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:32.030641 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 14:23:32.030772 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:32.030701 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 14:23:32.030772 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:32.030717 2567 projected.go:194] Error preparing data for projected volume kube-api-access-tzql6 for pod openshift-network-diagnostics/network-check-target-57rkt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:23:32.030896 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:32.030865 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f33ebad9-63f4-4a25-865f-68c02ee70c85-kube-api-access-tzql6 podName:f33ebad9-63f4-4a25-865f-68c02ee70c85 nodeName:}" failed. No retries permitted until 2026-04-24 14:23:32.530843131 +0000 UTC m=+3.120995835 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-tzql6" (UniqueName: "kubernetes.io/projected/f33ebad9-63f4-4a25-865f-68c02ee70c85-kube-api-access-tzql6") pod "network-check-target-57rkt" (UID: "f33ebad9-63f4-4a25-865f-68c02ee70c85") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:23:32.032143 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.032121 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mh88\" (UniqueName: \"kubernetes.io/projected/a071a258-93d2-4209-8cc4-3e5105208e68-kube-api-access-4mh88\") pod \"aws-ebs-csi-driver-node-7npfb\" (UID: \"a071a258-93d2-4209-8cc4-3e5105208e68\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7npfb" Apr 24 14:23:32.032457 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.032441 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bhnt\" (UniqueName: \"kubernetes.io/projected/b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73-kube-api-access-4bhnt\") pod \"multus-h5ptb\" (UID: \"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73\") " pod="openshift-multus/multus-h5ptb" Apr 24 14:23:32.034060 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.032695 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvh6j\" (UniqueName: \"kubernetes.io/projected/2a44ddf6-7291-4385-a9e5-9e6dd777407e-kube-api-access-dvh6j\") pod \"node-ca-m4w28\" (UID: \"2a44ddf6-7291-4385-a9e5-9e6dd777407e\") " pod="openshift-image-registry/node-ca-m4w28" Apr 24 14:23:32.034060 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.033157 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9ngj\" (UniqueName: \"kubernetes.io/projected/5fc12c1f-0a1b-49b5-b20f-0208c496ba66-kube-api-access-b9ngj\") pod \"iptables-alerter-m2lt2\" (UID: \"5fc12c1f-0a1b-49b5-b20f-0208c496ba66\") " 
pod="openshift-network-operator/iptables-alerter-m2lt2" Apr 24 14:23:32.036620 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.036579 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r54w4\" (UniqueName: \"kubernetes.io/projected/1fba2eb1-536b-4442-bf40-5af241dd98fd-kube-api-access-r54w4\") pod \"tuned-vl2zx\" (UID: \"1fba2eb1-536b-4442-bf40-5af241dd98fd\") " pod="openshift-cluster-node-tuning-operator/tuned-vl2zx" Apr 24 14:23:32.037271 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.036767 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv6zc\" (UniqueName: \"kubernetes.io/projected/04393dd2-c684-4592-bc88-2223fac95a11-kube-api-access-dv6zc\") pod \"multus-additional-cni-plugins-cb58k\" (UID: \"04393dd2-c684-4592-bc88-2223fac95a11\") " pod="openshift-multus/multus-additional-cni-plugins-cb58k" Apr 24 14:23:32.037271 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.036903 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvs6j\" (UniqueName: \"kubernetes.io/projected/952d5757-28bc-4940-9fa6-4a50ffff6476-kube-api-access-hvs6j\") pod \"network-metrics-daemon-ct9nz\" (UID: \"952d5757-28bc-4940-9fa6-4a50ffff6476\") " pod="openshift-multus/network-metrics-daemon-ct9nz" Apr 24 14:23:32.037418 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.037280 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tzl7\" (UniqueName: \"kubernetes.io/projected/e0bd6039-b2d8-405a-b478-69690078dd73-kube-api-access-2tzl7\") pod \"ovnkube-node-8jd5s\" (UID: \"e0bd6039-b2d8-405a-b478-69690078dd73\") " pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s" Apr 24 14:23:32.116685 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.116611 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 14:23:32.205411 ip-10-0-128-36 kubenswrapper[2567]: I0424 
14:23:32.205380 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s" Apr 24 14:23:32.212365 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.212328 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-vrgvl" Apr 24 14:23:32.219802 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.219779 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7npfb" Apr 24 14:23:32.224351 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.224321 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-m4w28" Apr 24 14:23:32.232034 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.232013 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-cb58k" Apr 24 14:23:32.240540 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.240519 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-h5ptb" Apr 24 14:23:32.247103 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.247078 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-m2lt2" Apr 24 14:23:32.251606 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.251588 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-vl2zx" Apr 24 14:23:32.528674 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.528593 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/952d5757-28bc-4940-9fa6-4a50ffff6476-metrics-certs\") pod \"network-metrics-daemon-ct9nz\" (UID: \"952d5757-28bc-4940-9fa6-4a50ffff6476\") " pod="openshift-multus/network-metrics-daemon-ct9nz" Apr 24 14:23:32.528829 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:32.528721 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:23:32.528829 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:32.528783 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/952d5757-28bc-4940-9fa6-4a50ffff6476-metrics-certs podName:952d5757-28bc-4940-9fa6-4a50ffff6476 nodeName:}" failed. No retries permitted until 2026-04-24 14:23:33.528763043 +0000 UTC m=+4.118915745 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/952d5757-28bc-4940-9fa6-4a50ffff6476-metrics-certs") pod "network-metrics-daemon-ct9nz" (UID: "952d5757-28bc-4940-9fa6-4a50ffff6476") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:23:32.629509 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.629482 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tzql6\" (UniqueName: \"kubernetes.io/projected/f33ebad9-63f4-4a25-865f-68c02ee70c85-kube-api-access-tzql6\") pod \"network-check-target-57rkt\" (UID: \"f33ebad9-63f4-4a25-865f-68c02ee70c85\") " pod="openshift-network-diagnostics/network-check-target-57rkt" Apr 24 14:23:32.629644 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:32.629607 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 14:23:32.629644 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:32.629623 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 14:23:32.629644 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:32.629634 2567 projected.go:194] Error preparing data for projected volume kube-api-access-tzql6 for pod openshift-network-diagnostics/network-check-target-57rkt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:23:32.629741 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:32.629680 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f33ebad9-63f4-4a25-865f-68c02ee70c85-kube-api-access-tzql6 podName:f33ebad9-63f4-4a25-865f-68c02ee70c85 nodeName:}" failed. 
No retries permitted until 2026-04-24 14:23:33.62966731 +0000 UTC m=+4.219820023 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-tzql6" (UniqueName: "kubernetes.io/projected/f33ebad9-63f4-4a25-865f-68c02ee70c85-kube-api-access-tzql6") pod "network-check-target-57rkt" (UID: "f33ebad9-63f4-4a25-865f-68c02ee70c85") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:23:32.651344 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:32.651308 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fc12c1f_0a1b_49b5_b20f_0208c496ba66.slice/crio-401e5e40db7bad16cf5d2c104ff908203a7c49f3f9aa79b2e880a4d82dd8ed43 WatchSource:0}: Error finding container 401e5e40db7bad16cf5d2c104ff908203a7c49f3f9aa79b2e880a4d82dd8ed43: Status 404 returned error can't find the container with id 401e5e40db7bad16cf5d2c104ff908203a7c49f3f9aa79b2e880a4d82dd8ed43 Apr 24 14:23:32.652992 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:32.652967 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a01a64d_13d5_4b58_b51d_c8bda8dbefb6.slice/crio-98552739525f19ffc7f25698d47893a227a6317297d80be2afcd9e24e50d78f6 WatchSource:0}: Error finding container 98552739525f19ffc7f25698d47893a227a6317297d80be2afcd9e24e50d78f6: Status 404 returned error can't find the container with id 98552739525f19ffc7f25698d47893a227a6317297d80be2afcd9e24e50d78f6 Apr 24 14:23:32.653654 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:32.653630 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fba2eb1_536b_4442_bf40_5af241dd98fd.slice/crio-dbf29941f6be9edd8e9ab4546f0bd3eeef65ba6561f79c53c3e1171de9c6ceab WatchSource:0}: Error finding container 
dbf29941f6be9edd8e9ab4546f0bd3eeef65ba6561f79c53c3e1171de9c6ceab: Status 404 returned error can't find the container with id dbf29941f6be9edd8e9ab4546f0bd3eeef65ba6561f79c53c3e1171de9c6ceab Apr 24 14:23:32.657350 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:32.657323 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a44ddf6_7291_4385_a9e5_9e6dd777407e.slice/crio-452d9b2c3048d42eb6d12d984bebd8995e86806009fca11e7c8362800e3b824d WatchSource:0}: Error finding container 452d9b2c3048d42eb6d12d984bebd8995e86806009fca11e7c8362800e3b824d: Status 404 returned error can't find the container with id 452d9b2c3048d42eb6d12d984bebd8995e86806009fca11e7c8362800e3b824d Apr 24 14:23:32.658004 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:32.657980 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04393dd2_c684_4592_bc88_2223fac95a11.slice/crio-3aa170196b92da183d771420fd1781c08bc7a389d1b5cccb2c4e564d00a11e39 WatchSource:0}: Error finding container 3aa170196b92da183d771420fd1781c08bc7a389d1b5cccb2c4e564d00a11e39: Status 404 returned error can't find the container with id 3aa170196b92da183d771420fd1781c08bc7a389d1b5cccb2c4e564d00a11e39 Apr 24 14:23:32.659000 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:32.658796 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0bd6039_b2d8_405a_b478_69690078dd73.slice/crio-8da4dfad84dcb8a8c0b5f21ef3150d5e7ff27d572c5bf19bc5865831fac95cb5 WatchSource:0}: Error finding container 8da4dfad84dcb8a8c0b5f21ef3150d5e7ff27d572c5bf19bc5865831fac95cb5: Status 404 returned error can't find the container with id 8da4dfad84dcb8a8c0b5f21ef3150d5e7ff27d572c5bf19bc5865831fac95cb5 Apr 24 14:23:32.661296 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:32.661275 2567 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5fcbe2e_31cb_4da0_bddb_8eec79e0ca73.slice/crio-535805a127948ad4b78441e2fb08c50b3b2eada6ee6cbc157413baeb1dc079b3 WatchSource:0}: Error finding container 535805a127948ad4b78441e2fb08c50b3b2eada6ee6cbc157413baeb1dc079b3: Status 404 returned error can't find the container with id 535805a127948ad4b78441e2fb08c50b3b2eada6ee6cbc157413baeb1dc079b3 Apr 24 14:23:32.663644 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:32.663623 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda071a258_93d2_4209_8cc4_3e5105208e68.slice/crio-e78bb3e9521c62cc1124b054ae61a9ab0365ddc65555df9091783ee5e93b6745 WatchSource:0}: Error finding container e78bb3e9521c62cc1124b054ae61a9ab0365ddc65555df9091783ee5e93b6745: Status 404 returned error can't find the container with id e78bb3e9521c62cc1124b054ae61a9ab0365ddc65555df9091783ee5e93b6745 Apr 24 14:23:32.948136 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.947890 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 14:18:30 +0000 UTC" deadline="2027-10-26 05:30:48.436454129 +0000 UTC" Apr 24 14:23:32.948136 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:32.948066 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13191h7m15.488391225s" Apr 24 14:23:33.018966 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:33.018918 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ct9nz" Apr 24 14:23:33.019380 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:33.019303 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ct9nz" podUID="952d5757-28bc-4940-9fa6-4a50ffff6476" Apr 24 14:23:33.029555 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:33.029526 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7npfb" event={"ID":"a071a258-93d2-4209-8cc4-3e5105208e68","Type":"ContainerStarted","Data":"e78bb3e9521c62cc1124b054ae61a9ab0365ddc65555df9091783ee5e93b6745"} Apr 24 14:23:33.031988 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:33.031957 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-m4w28" event={"ID":"2a44ddf6-7291-4385-a9e5-9e6dd777407e","Type":"ContainerStarted","Data":"452d9b2c3048d42eb6d12d984bebd8995e86806009fca11e7c8362800e3b824d"} Apr 24 14:23:33.035225 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:33.035176 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vrgvl" event={"ID":"1a01a64d-13d5-4b58-b51d-c8bda8dbefb6","Type":"ContainerStarted","Data":"98552739525f19ffc7f25698d47893a227a6317297d80be2afcd9e24e50d78f6"} Apr 24 14:23:33.044583 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:33.044012 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-36.ec2.internal" event={"ID":"ec6b2c56954869eef2ac269c4c51a583","Type":"ContainerStarted","Data":"e4e9d4d0130dc04e9dc6934714c3bd13aeabd9585524187c931ee2d1f8a7e8ad"} Apr 24 14:23:33.053039 ip-10-0-128-36 kubenswrapper[2567]: I0424 
14:23:33.053011 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-h5ptb" event={"ID":"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73","Type":"ContainerStarted","Data":"535805a127948ad4b78441e2fb08c50b3b2eada6ee6cbc157413baeb1dc079b3"} Apr 24 14:23:33.054928 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:33.054874 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s" event={"ID":"e0bd6039-b2d8-405a-b478-69690078dd73","Type":"ContainerStarted","Data":"8da4dfad84dcb8a8c0b5f21ef3150d5e7ff27d572c5bf19bc5865831fac95cb5"} Apr 24 14:23:33.057836 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:33.057161 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-36.ec2.internal" podStartSLOduration=2.057146932 podStartE2EDuration="2.057146932s" podCreationTimestamp="2026-04-24 14:23:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:23:33.056893478 +0000 UTC m=+3.647046202" watchObservedRunningTime="2026-04-24 14:23:33.057146932 +0000 UTC m=+3.647299655" Apr 24 14:23:33.063227 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:33.062987 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cb58k" event={"ID":"04393dd2-c684-4592-bc88-2223fac95a11","Type":"ContainerStarted","Data":"3aa170196b92da183d771420fd1781c08bc7a389d1b5cccb2c4e564d00a11e39"} Apr 24 14:23:33.067082 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:33.067009 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-vl2zx" event={"ID":"1fba2eb1-536b-4442-bf40-5af241dd98fd","Type":"ContainerStarted","Data":"dbf29941f6be9edd8e9ab4546f0bd3eeef65ba6561f79c53c3e1171de9c6ceab"} Apr 24 14:23:33.070061 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:33.070036 2567 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-m2lt2" event={"ID":"5fc12c1f-0a1b-49b5-b20f-0208c496ba66","Type":"ContainerStarted","Data":"401e5e40db7bad16cf5d2c104ff908203a7c49f3f9aa79b2e880a4d82dd8ed43"} Apr 24 14:23:33.536835 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:33.536548 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/952d5757-28bc-4940-9fa6-4a50ffff6476-metrics-certs\") pod \"network-metrics-daemon-ct9nz\" (UID: \"952d5757-28bc-4940-9fa6-4a50ffff6476\") " pod="openshift-multus/network-metrics-daemon-ct9nz" Apr 24 14:23:33.537044 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:33.536842 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:23:33.537044 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:33.536913 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/952d5757-28bc-4940-9fa6-4a50ffff6476-metrics-certs podName:952d5757-28bc-4940-9fa6-4a50ffff6476 nodeName:}" failed. No retries permitted until 2026-04-24 14:23:35.5368943 +0000 UTC m=+6.127047004 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/952d5757-28bc-4940-9fa6-4a50ffff6476-metrics-certs") pod "network-metrics-daemon-ct9nz" (UID: "952d5757-28bc-4940-9fa6-4a50ffff6476") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:23:33.638200 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:33.637489 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tzql6\" (UniqueName: \"kubernetes.io/projected/f33ebad9-63f4-4a25-865f-68c02ee70c85-kube-api-access-tzql6\") pod \"network-check-target-57rkt\" (UID: \"f33ebad9-63f4-4a25-865f-68c02ee70c85\") " pod="openshift-network-diagnostics/network-check-target-57rkt" Apr 24 14:23:33.638200 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:33.637719 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 14:23:33.638200 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:33.637742 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 14:23:33.638200 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:33.637755 2567 projected.go:194] Error preparing data for projected volume kube-api-access-tzql6 for pod openshift-network-diagnostics/network-check-target-57rkt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:23:33.638200 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:33.637860 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f33ebad9-63f4-4a25-865f-68c02ee70c85-kube-api-access-tzql6 podName:f33ebad9-63f4-4a25-865f-68c02ee70c85 nodeName:}" failed. 
No retries permitted until 2026-04-24 14:23:35.637841363 +0000 UTC m=+6.227994065 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-tzql6" (UniqueName: "kubernetes.io/projected/f33ebad9-63f4-4a25-865f-68c02ee70c85-kube-api-access-tzql6") pod "network-check-target-57rkt" (UID: "f33ebad9-63f4-4a25-865f-68c02ee70c85") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:23:34.021413 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:34.021380 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57rkt" Apr 24 14:23:34.021820 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:34.021505 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57rkt" podUID="f33ebad9-63f4-4a25-865f-68c02ee70c85" Apr 24 14:23:34.085886 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:34.085778 2567 generic.go:358] "Generic (PLEG): container finished" podID="1bd5d9b9689ebb5ba9a6f14c7f815d41" containerID="2c7a88df0e51dcb75faff89d61e23ca747bee3588043bbc7018ec5c256767a83" exitCode=0 Apr 24 14:23:34.086034 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:34.085904 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-36.ec2.internal" event={"ID":"1bd5d9b9689ebb5ba9a6f14c7f815d41","Type":"ContainerDied","Data":"2c7a88df0e51dcb75faff89d61e23ca747bee3588043bbc7018ec5c256767a83"} Apr 24 14:23:35.018199 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:35.018168 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ct9nz" Apr 24 14:23:35.018411 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:35.018317 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ct9nz" podUID="952d5757-28bc-4940-9fa6-4a50ffff6476" Apr 24 14:23:35.091655 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:35.091314 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-36.ec2.internal" event={"ID":"1bd5d9b9689ebb5ba9a6f14c7f815d41","Type":"ContainerStarted","Data":"bba30628d7c7d818c494e18402856a6320dbd2d0dbc1a5318446194d849f5af3"} Apr 24 14:23:35.104864 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:35.104798 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-36.ec2.internal" podStartSLOduration=4.10478117 podStartE2EDuration="4.10478117s" podCreationTimestamp="2026-04-24 14:23:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:23:35.10410744 +0000 UTC m=+5.694260163" watchObservedRunningTime="2026-04-24 14:23:35.10478117 +0000 UTC m=+5.694933895" Apr 24 14:23:35.554148 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:35.554090 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/952d5757-28bc-4940-9fa6-4a50ffff6476-metrics-certs\") pod \"network-metrics-daemon-ct9nz\" (UID: \"952d5757-28bc-4940-9fa6-4a50ffff6476\") " pod="openshift-multus/network-metrics-daemon-ct9nz" Apr 24 14:23:35.554316 ip-10-0-128-36 
kubenswrapper[2567]: E0424 14:23:35.554295 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:23:35.554509 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:35.554366 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/952d5757-28bc-4940-9fa6-4a50ffff6476-metrics-certs podName:952d5757-28bc-4940-9fa6-4a50ffff6476 nodeName:}" failed. No retries permitted until 2026-04-24 14:23:39.554348049 +0000 UTC m=+10.144500776 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/952d5757-28bc-4940-9fa6-4a50ffff6476-metrics-certs") pod "network-metrics-daemon-ct9nz" (UID: "952d5757-28bc-4940-9fa6-4a50ffff6476") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:23:35.654845 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:35.654761 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tzql6\" (UniqueName: \"kubernetes.io/projected/f33ebad9-63f4-4a25-865f-68c02ee70c85-kube-api-access-tzql6\") pod \"network-check-target-57rkt\" (UID: \"f33ebad9-63f4-4a25-865f-68c02ee70c85\") " pod="openshift-network-diagnostics/network-check-target-57rkt" Apr 24 14:23:35.655029 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:35.654944 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 14:23:35.655029 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:35.654966 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 14:23:35.655029 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:35.654979 2567 projected.go:194] Error preparing data for projected volume kube-api-access-tzql6 for pod 
openshift-network-diagnostics/network-check-target-57rkt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:23:35.655190 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:35.655040 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f33ebad9-63f4-4a25-865f-68c02ee70c85-kube-api-access-tzql6 podName:f33ebad9-63f4-4a25-865f-68c02ee70c85 nodeName:}" failed. No retries permitted until 2026-04-24 14:23:39.655022229 +0000 UTC m=+10.245174943 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-tzql6" (UniqueName: "kubernetes.io/projected/f33ebad9-63f4-4a25-865f-68c02ee70c85-kube-api-access-tzql6") pod "network-check-target-57rkt" (UID: "f33ebad9-63f4-4a25-865f-68c02ee70c85") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:23:36.019302 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:36.019210 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57rkt" Apr 24 14:23:36.019467 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:36.019359 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57rkt" podUID="f33ebad9-63f4-4a25-865f-68c02ee70c85" Apr 24 14:23:37.018189 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:37.018156 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ct9nz" Apr 24 14:23:37.018635 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:37.018298 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ct9nz" podUID="952d5757-28bc-4940-9fa6-4a50ffff6476" Apr 24 14:23:38.019095 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:38.019061 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57rkt" Apr 24 14:23:38.019546 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:38.019193 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57rkt" podUID="f33ebad9-63f4-4a25-865f-68c02ee70c85" Apr 24 14:23:39.018520 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:39.018483 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ct9nz" Apr 24 14:23:39.018717 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:39.018602 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ct9nz" podUID="952d5757-28bc-4940-9fa6-4a50ffff6476" Apr 24 14:23:39.586641 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:39.586346 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/952d5757-28bc-4940-9fa6-4a50ffff6476-metrics-certs\") pod \"network-metrics-daemon-ct9nz\" (UID: \"952d5757-28bc-4940-9fa6-4a50ffff6476\") " pod="openshift-multus/network-metrics-daemon-ct9nz" Apr 24 14:23:39.586641 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:39.586535 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:23:39.586641 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:39.586605 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/952d5757-28bc-4940-9fa6-4a50ffff6476-metrics-certs podName:952d5757-28bc-4940-9fa6-4a50ffff6476 nodeName:}" failed. No retries permitted until 2026-04-24 14:23:47.586585923 +0000 UTC m=+18.176738624 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/952d5757-28bc-4940-9fa6-4a50ffff6476-metrics-certs") pod "network-metrics-daemon-ct9nz" (UID: "952d5757-28bc-4940-9fa6-4a50ffff6476") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:23:39.687074 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:39.687038 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tzql6\" (UniqueName: \"kubernetes.io/projected/f33ebad9-63f4-4a25-865f-68c02ee70c85-kube-api-access-tzql6\") pod \"network-check-target-57rkt\" (UID: \"f33ebad9-63f4-4a25-865f-68c02ee70c85\") " pod="openshift-network-diagnostics/network-check-target-57rkt" Apr 24 14:23:39.687263 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:39.687240 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 14:23:39.687327 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:39.687270 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 14:23:39.687327 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:39.687285 2567 projected.go:194] Error preparing data for projected volume kube-api-access-tzql6 for pod openshift-network-diagnostics/network-check-target-57rkt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:23:39.687436 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:39.687348 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f33ebad9-63f4-4a25-865f-68c02ee70c85-kube-api-access-tzql6 podName:f33ebad9-63f4-4a25-865f-68c02ee70c85 nodeName:}" failed. 
No retries permitted until 2026-04-24 14:23:47.687330196 +0000 UTC m=+18.277482910 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-tzql6" (UniqueName: "kubernetes.io/projected/f33ebad9-63f4-4a25-865f-68c02ee70c85-kube-api-access-tzql6") pod "network-check-target-57rkt" (UID: "f33ebad9-63f4-4a25-865f-68c02ee70c85") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:23:40.021132 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:40.020183 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57rkt" Apr 24 14:23:40.021132 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:40.020340 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57rkt" podUID="f33ebad9-63f4-4a25-865f-68c02ee70c85" Apr 24 14:23:41.018606 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:41.018567 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ct9nz" Apr 24 14:23:41.019071 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:41.018716 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ct9nz" podUID="952d5757-28bc-4940-9fa6-4a50ffff6476" Apr 24 14:23:42.018677 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:42.018643 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57rkt" Apr 24 14:23:42.019181 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:42.018755 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57rkt" podUID="f33ebad9-63f4-4a25-865f-68c02ee70c85" Apr 24 14:23:43.018636 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:43.018600 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ct9nz" Apr 24 14:23:43.018837 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:43.018713 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ct9nz" podUID="952d5757-28bc-4940-9fa6-4a50ffff6476" Apr 24 14:23:43.409921 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:43.409844 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-p222m"] Apr 24 14:23:43.461522 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:43.461488 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-p222m" Apr 24 14:23:43.463560 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:43.463535 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 14:23:43.463690 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:43.463564 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 14:23:43.464037 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:43.464019 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-pmznq\"" Apr 24 14:23:43.515586 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:43.515556 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxb9z\" (UniqueName: \"kubernetes.io/projected/fbd73a0c-457f-455c-b2ca-ef248d74efc8-kube-api-access-jxb9z\") pod \"node-resolver-p222m\" (UID: \"fbd73a0c-457f-455c-b2ca-ef248d74efc8\") " pod="openshift-dns/node-resolver-p222m" Apr 24 14:23:43.515750 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:43.515636 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fbd73a0c-457f-455c-b2ca-ef248d74efc8-tmp-dir\") pod \"node-resolver-p222m\" (UID: \"fbd73a0c-457f-455c-b2ca-ef248d74efc8\") " pod="openshift-dns/node-resolver-p222m" Apr 24 14:23:43.515750 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:43.515677 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fbd73a0c-457f-455c-b2ca-ef248d74efc8-hosts-file\") pod \"node-resolver-p222m\" (UID: \"fbd73a0c-457f-455c-b2ca-ef248d74efc8\") " pod="openshift-dns/node-resolver-p222m" Apr 24 14:23:43.616995 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:43.616957 
2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fbd73a0c-457f-455c-b2ca-ef248d74efc8-tmp-dir\") pod \"node-resolver-p222m\" (UID: \"fbd73a0c-457f-455c-b2ca-ef248d74efc8\") " pod="openshift-dns/node-resolver-p222m" Apr 24 14:23:43.616995 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:43.616997 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fbd73a0c-457f-455c-b2ca-ef248d74efc8-hosts-file\") pod \"node-resolver-p222m\" (UID: \"fbd73a0c-457f-455c-b2ca-ef248d74efc8\") " pod="openshift-dns/node-resolver-p222m" Apr 24 14:23:43.617196 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:43.617081 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fbd73a0c-457f-455c-b2ca-ef248d74efc8-hosts-file\") pod \"node-resolver-p222m\" (UID: \"fbd73a0c-457f-455c-b2ca-ef248d74efc8\") " pod="openshift-dns/node-resolver-p222m" Apr 24 14:23:43.617196 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:43.617118 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jxb9z\" (UniqueName: \"kubernetes.io/projected/fbd73a0c-457f-455c-b2ca-ef248d74efc8-kube-api-access-jxb9z\") pod \"node-resolver-p222m\" (UID: \"fbd73a0c-457f-455c-b2ca-ef248d74efc8\") " pod="openshift-dns/node-resolver-p222m" Apr 24 14:23:43.617339 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:43.617317 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fbd73a0c-457f-455c-b2ca-ef248d74efc8-tmp-dir\") pod \"node-resolver-p222m\" (UID: \"fbd73a0c-457f-455c-b2ca-ef248d74efc8\") " pod="openshift-dns/node-resolver-p222m" Apr 24 14:23:43.625540 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:43.625504 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-jxb9z\" (UniqueName: \"kubernetes.io/projected/fbd73a0c-457f-455c-b2ca-ef248d74efc8-kube-api-access-jxb9z\") pod \"node-resolver-p222m\" (UID: \"fbd73a0c-457f-455c-b2ca-ef248d74efc8\") " pod="openshift-dns/node-resolver-p222m" Apr 24 14:23:43.770830 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:43.770736 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-p222m" Apr 24 14:23:44.018722 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:44.018686 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57rkt" Apr 24 14:23:44.018903 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:44.018854 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57rkt" podUID="f33ebad9-63f4-4a25-865f-68c02ee70c85" Apr 24 14:23:45.018241 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:45.018151 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ct9nz" Apr 24 14:23:45.018410 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:45.018265 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ct9nz" podUID="952d5757-28bc-4940-9fa6-4a50ffff6476" Apr 24 14:23:46.019047 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:46.019010 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57rkt" Apr 24 14:23:46.019460 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:46.019136 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57rkt" podUID="f33ebad9-63f4-4a25-865f-68c02ee70c85" Apr 24 14:23:47.018690 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:47.018664 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ct9nz" Apr 24 14:23:47.018907 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:47.018785 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ct9nz" podUID="952d5757-28bc-4940-9fa6-4a50ffff6476" Apr 24 14:23:47.646286 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:47.646231 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/952d5757-28bc-4940-9fa6-4a50ffff6476-metrics-certs\") pod \"network-metrics-daemon-ct9nz\" (UID: \"952d5757-28bc-4940-9fa6-4a50ffff6476\") " pod="openshift-multus/network-metrics-daemon-ct9nz" Apr 24 14:23:47.646739 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:47.646388 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:23:47.646739 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:47.646474 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/952d5757-28bc-4940-9fa6-4a50ffff6476-metrics-certs podName:952d5757-28bc-4940-9fa6-4a50ffff6476 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:03.646451829 +0000 UTC m=+34.236604532 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/952d5757-28bc-4940-9fa6-4a50ffff6476-metrics-certs") pod "network-metrics-daemon-ct9nz" (UID: "952d5757-28bc-4940-9fa6-4a50ffff6476") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:23:47.747469 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:47.747431 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tzql6\" (UniqueName: \"kubernetes.io/projected/f33ebad9-63f4-4a25-865f-68c02ee70c85-kube-api-access-tzql6\") pod \"network-check-target-57rkt\" (UID: \"f33ebad9-63f4-4a25-865f-68c02ee70c85\") " pod="openshift-network-diagnostics/network-check-target-57rkt" Apr 24 14:23:47.747624 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:47.747556 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 14:23:47.747624 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:47.747575 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 14:23:47.747624 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:47.747585 2567 projected.go:194] Error preparing data for projected volume kube-api-access-tzql6 for pod openshift-network-diagnostics/network-check-target-57rkt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:23:47.747722 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:47.747650 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f33ebad9-63f4-4a25-865f-68c02ee70c85-kube-api-access-tzql6 podName:f33ebad9-63f4-4a25-865f-68c02ee70c85 nodeName:}" failed. 
No retries permitted until 2026-04-24 14:24:03.747631028 +0000 UTC m=+34.337783729 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-tzql6" (UniqueName: "kubernetes.io/projected/f33ebad9-63f4-4a25-865f-68c02ee70c85-kube-api-access-tzql6") pod "network-check-target-57rkt" (UID: "f33ebad9-63f4-4a25-865f-68c02ee70c85") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:23:48.019145 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:48.019062 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57rkt" Apr 24 14:23:48.019283 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:48.019193 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57rkt" podUID="f33ebad9-63f4-4a25-865f-68c02ee70c85" Apr 24 14:23:49.018710 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:49.018689 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ct9nz" Apr 24 14:23:49.018956 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:49.018792 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ct9nz" podUID="952d5757-28bc-4940-9fa6-4a50ffff6476" Apr 24 14:23:49.049115 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:23:49.049089 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbd73a0c_457f_455c_b2ca_ef248d74efc8.slice/crio-f57ea43193b5ac26137c36cce3cee2b855a8f41225c2e87537cba79fbd72f4dc WatchSource:0}: Error finding container f57ea43193b5ac26137c36cce3cee2b855a8f41225c2e87537cba79fbd72f4dc: Status 404 returned error can't find the container with id f57ea43193b5ac26137c36cce3cee2b855a8f41225c2e87537cba79fbd72f4dc Apr 24 14:23:49.137514 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:49.137272 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-p222m" event={"ID":"fbd73a0c-457f-455c-b2ca-ef248d74efc8","Type":"ContainerStarted","Data":"f57ea43193b5ac26137c36cce3cee2b855a8f41225c2e87537cba79fbd72f4dc"} Apr 24 14:23:50.019157 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:50.018946 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57rkt" Apr 24 14:23:50.019746 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:50.019176 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-57rkt" podUID="f33ebad9-63f4-4a25-865f-68c02ee70c85" Apr 24 14:23:50.141405 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:50.141371 2567 generic.go:358] "Generic (PLEG): container finished" podID="04393dd2-c684-4592-bc88-2223fac95a11" containerID="1df1d81ef623b620e676021a07b7e93e8d317a39e29934ea8cd9b79ed279c443" exitCode=0 Apr 24 14:23:50.141584 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:50.141455 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cb58k" event={"ID":"04393dd2-c684-4592-bc88-2223fac95a11","Type":"ContainerDied","Data":"1df1d81ef623b620e676021a07b7e93e8d317a39e29934ea8cd9b79ed279c443"} Apr 24 14:23:50.142858 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:50.142835 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-vl2zx" event={"ID":"1fba2eb1-536b-4442-bf40-5af241dd98fd","Type":"ContainerStarted","Data":"a2d63d66d3cde3f6500768771fe3eca99a58890f194e4531433c9202ea812cfd"} Apr 24 14:23:50.144043 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:50.144025 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-p222m" event={"ID":"fbd73a0c-457f-455c-b2ca-ef248d74efc8","Type":"ContainerStarted","Data":"f8fb1c41720c339691b8d74c799f336d486496f3d642cce19f76f29eacb16f48"} Apr 24 14:23:50.145189 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:50.145166 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7npfb" event={"ID":"a071a258-93d2-4209-8cc4-3e5105208e68","Type":"ContainerStarted","Data":"32a1929622dc2d7233a6a9e4d25d07f592ba5bd7440278988bafb9dbb441840d"} Apr 24 14:23:50.146495 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:50.146473 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-m4w28" 
event={"ID":"2a44ddf6-7291-4385-a9e5-9e6dd777407e","Type":"ContainerStarted","Data":"84c0d6cfec908f40b143de6c9dd6438c53e1e1f180327063c634eb722825e0f7"} Apr 24 14:23:50.147845 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:50.147825 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vrgvl" event={"ID":"1a01a64d-13d5-4b58-b51d-c8bda8dbefb6","Type":"ContainerStarted","Data":"c40487dd9a7b99a335a84242adccc7f34ed35f1f8dad01a6165408fa63cca328"} Apr 24 14:23:50.149028 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:50.149010 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-h5ptb" event={"ID":"b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73","Type":"ContainerStarted","Data":"de2c9d1e2ce2e9fd89e7110639c747e1eb9fbdb71bc96ebc2569c93f9002813c"} Apr 24 14:23:50.151379 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:50.151362 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jd5s_e0bd6039-b2d8-405a-b478-69690078dd73/ovn-acl-logging/0.log" Apr 24 14:23:50.151631 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:50.151615 2567 generic.go:358] "Generic (PLEG): container finished" podID="e0bd6039-b2d8-405a-b478-69690078dd73" containerID="f88fb7e9ed4e5bf52c8e952a9d73def58a196688ee3d6a93fc6ba0ce50792326" exitCode=1 Apr 24 14:23:50.151672 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:50.151642 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s" event={"ID":"e0bd6039-b2d8-405a-b478-69690078dd73","Type":"ContainerStarted","Data":"beb9c976820386f8e01b705ba8e373da623a07a542834b113de8d2a5932f675d"} Apr 24 14:23:50.151672 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:50.151656 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s" 
event={"ID":"e0bd6039-b2d8-405a-b478-69690078dd73","Type":"ContainerStarted","Data":"577545139ef3ce1971db68d86b1c8bdfe7ee770ce5be62563bf99aafe335e124"} Apr 24 14:23:50.151672 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:50.151669 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s" event={"ID":"e0bd6039-b2d8-405a-b478-69690078dd73","Type":"ContainerStarted","Data":"1264c9c18ad572886490b86b5b84ad8cfa908c03c6be20cf08a0ad317cbeadf5"} Apr 24 14:23:50.151754 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:50.151680 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s" event={"ID":"e0bd6039-b2d8-405a-b478-69690078dd73","Type":"ContainerStarted","Data":"38c4a1af522de387b1692d25c7f7d0f0a15c08a1228f0dfcaac090348c5f84c8"} Apr 24 14:23:50.151754 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:50.151688 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s" event={"ID":"e0bd6039-b2d8-405a-b478-69690078dd73","Type":"ContainerDied","Data":"f88fb7e9ed4e5bf52c8e952a9d73def58a196688ee3d6a93fc6ba0ce50792326"} Apr 24 14:23:50.151754 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:50.151696 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s" event={"ID":"e0bd6039-b2d8-405a-b478-69690078dd73","Type":"ContainerStarted","Data":"7e111baa712e9ff4f7e26c3583cce344c255fb58f2a8a17fd204ee2f198bd656"} Apr 24 14:23:50.168008 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:50.167967 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-vl2zx" podStartSLOduration=3.777410444 podStartE2EDuration="20.167953755s" podCreationTimestamp="2026-04-24 14:23:30 +0000 UTC" firstStartedPulling="2026-04-24 14:23:32.655652311 +0000 UTC m=+3.245805023" lastFinishedPulling="2026-04-24 14:23:49.046195634 +0000 UTC m=+19.636348334" 
observedRunningTime="2026-04-24 14:23:50.167740463 +0000 UTC m=+20.757893184" watchObservedRunningTime="2026-04-24 14:23:50.167953755 +0000 UTC m=+20.758106475" Apr 24 14:23:50.179442 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:50.179404 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-p222m" podStartSLOduration=7.179391419 podStartE2EDuration="7.179391419s" podCreationTimestamp="2026-04-24 14:23:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:23:50.178953953 +0000 UTC m=+20.769106674" watchObservedRunningTime="2026-04-24 14:23:50.179391419 +0000 UTC m=+20.769544141" Apr 24 14:23:50.191627 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:50.191578 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-vrgvl" podStartSLOduration=3.802515768 podStartE2EDuration="20.191560586s" podCreationTimestamp="2026-04-24 14:23:30 +0000 UTC" firstStartedPulling="2026-04-24 14:23:32.655698489 +0000 UTC m=+3.245851189" lastFinishedPulling="2026-04-24 14:23:49.044743307 +0000 UTC m=+19.634896007" observedRunningTime="2026-04-24 14:23:50.191102166 +0000 UTC m=+20.781254888" watchObservedRunningTime="2026-04-24 14:23:50.191560586 +0000 UTC m=+20.781713307" Apr 24 14:23:50.199611 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:50.199590 2567 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 24 14:23:50.203333 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:50.203296 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-m4w28" podStartSLOduration=8.302044115 podStartE2EDuration="20.203283662s" podCreationTimestamp="2026-04-24 14:23:30 +0000 UTC" firstStartedPulling="2026-04-24 14:23:32.659214117 +0000 UTC 
m=+3.249366824" lastFinishedPulling="2026-04-24 14:23:44.560453654 +0000 UTC m=+15.150606371" observedRunningTime="2026-04-24 14:23:50.202997195 +0000 UTC m=+20.793149920" watchObservedRunningTime="2026-04-24 14:23:50.203283662 +0000 UTC m=+20.793436687" Apr 24 14:23:50.218366 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:50.217365 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-h5ptb" podStartSLOduration=3.53747217 podStartE2EDuration="20.217346717s" podCreationTimestamp="2026-04-24 14:23:30 +0000 UTC" firstStartedPulling="2026-04-24 14:23:32.665235833 +0000 UTC m=+3.255388533" lastFinishedPulling="2026-04-24 14:23:49.34511038 +0000 UTC m=+19.935263080" observedRunningTime="2026-04-24 14:23:50.216252064 +0000 UTC m=+20.806404786" watchObservedRunningTime="2026-04-24 14:23:50.217346717 +0000 UTC m=+20.807499438" Apr 24 14:23:50.984642 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:50.984281 2567 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T14:23:50.199604867Z","UUID":"d34e2a21-8490-44e4-826d-21cc34c479ae","Handler":null,"Name":"","Endpoint":""} Apr 24 14:23:50.987186 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:50.987163 2567 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 24 14:23:50.987358 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:50.987224 2567 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 14:23:51.018352 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:51.018319 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ct9nz" Apr 24 14:23:51.018601 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:51.018507 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ct9nz" podUID="952d5757-28bc-4940-9fa6-4a50ffff6476" Apr 24 14:23:51.155267 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:51.155235 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-m2lt2" event={"ID":"5fc12c1f-0a1b-49b5-b20f-0208c496ba66","Type":"ContainerStarted","Data":"d06395914644120b08df593e0e23cffa7015650ca3b6c6a1e148446b4f6d8366"} Apr 24 14:23:51.157629 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:51.157583 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7npfb" event={"ID":"a071a258-93d2-4209-8cc4-3e5105208e68","Type":"ContainerStarted","Data":"d62614cf69fac783d095c008ee6b4fb59351214051e8adc3248ec692b0bea36a"} Apr 24 14:23:51.166018 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:51.165972 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-m2lt2" podStartSLOduration=4.774447955 podStartE2EDuration="21.165950195s" podCreationTimestamp="2026-04-24 14:23:30 +0000 UTC" firstStartedPulling="2026-04-24 14:23:32.652917403 +0000 UTC m=+3.243070109" lastFinishedPulling="2026-04-24 14:23:49.044419649 +0000 UTC m=+19.634572349" observedRunningTime="2026-04-24 14:23:51.165844241 +0000 UTC m=+21.755996963" watchObservedRunningTime="2026-04-24 14:23:51.165950195 +0000 UTC m=+21.756102919" Apr 24 14:23:52.018730 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:52.018654 2567 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57rkt" Apr 24 14:23:52.019004 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:52.018773 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57rkt" podUID="f33ebad9-63f4-4a25-865f-68c02ee70c85" Apr 24 14:23:52.162829 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:52.162787 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jd5s_e0bd6039-b2d8-405a-b478-69690078dd73/ovn-acl-logging/0.log" Apr 24 14:23:52.163327 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:52.163257 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s" event={"ID":"e0bd6039-b2d8-405a-b478-69690078dd73","Type":"ContainerStarted","Data":"1333ccd100fc5c4e539d6902dd1faa9c50f4850903eac191883954b0a4e4999a"} Apr 24 14:23:52.165154 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:52.165127 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7npfb" event={"ID":"a071a258-93d2-4209-8cc4-3e5105208e68","Type":"ContainerStarted","Data":"2b89c73aa664cf1017ab3ba7b9df5ea2d9a675c6114f187fce2e19e48948b687"} Apr 24 14:23:52.558672 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:52.558630 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-vrgvl" Apr 24 14:23:52.559279 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:52.559257 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-vrgvl" Apr 24 14:23:52.571672 ip-10-0-128-36 
kubenswrapper[2567]: I0424 14:23:52.571626 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7npfb" podStartSLOduration=4.207152351 podStartE2EDuration="22.571613941s" podCreationTimestamp="2026-04-24 14:23:30 +0000 UTC" firstStartedPulling="2026-04-24 14:23:32.666262085 +0000 UTC m=+3.256414799" lastFinishedPulling="2026-04-24 14:23:51.030723675 +0000 UTC m=+21.620876389" observedRunningTime="2026-04-24 14:23:52.178964377 +0000 UTC m=+22.769117094" watchObservedRunningTime="2026-04-24 14:23:52.571613941 +0000 UTC m=+23.161766665" Apr 24 14:23:53.018790 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:53.018760 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ct9nz" Apr 24 14:23:53.018984 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:53.018911 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ct9nz" podUID="952d5757-28bc-4940-9fa6-4a50ffff6476" Apr 24 14:23:54.019215 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:54.019135 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57rkt" Apr 24 14:23:54.019772 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:54.019260 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-57rkt" podUID="f33ebad9-63f4-4a25-865f-68c02ee70c85" Apr 24 14:23:55.018794 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:55.018619 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ct9nz" Apr 24 14:23:55.018963 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:55.018909 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ct9nz" podUID="952d5757-28bc-4940-9fa6-4a50ffff6476" Apr 24 14:23:55.172914 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:55.172887 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jd5s_e0bd6039-b2d8-405a-b478-69690078dd73/ovn-acl-logging/0.log" Apr 24 14:23:55.173275 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:55.173225 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s" event={"ID":"e0bd6039-b2d8-405a-b478-69690078dd73","Type":"ContainerStarted","Data":"f5a1575024f51bb51fdecfcd5d560acc006ca142cb97d470ba6e68e18ae6e361"} Apr 24 14:23:55.173540 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:55.173517 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s" Apr 24 14:23:55.173733 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:55.173716 2567 scope.go:117] "RemoveContainer" containerID="f88fb7e9ed4e5bf52c8e952a9d73def58a196688ee3d6a93fc6ba0ce50792326" Apr 24 14:23:55.175125 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:55.175104 2567 generic.go:358] "Generic (PLEG): container finished" podID="04393dd2-c684-4592-bc88-2223fac95a11" 
containerID="1e9c1e2b0b87f8d7fdf57455f83b2d6b73df33be96bedc3ea01f67ac06d0314b" exitCode=0 Apr 24 14:23:55.175185 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:55.175140 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cb58k" event={"ID":"04393dd2-c684-4592-bc88-2223fac95a11","Type":"ContainerDied","Data":"1e9c1e2b0b87f8d7fdf57455f83b2d6b73df33be96bedc3ea01f67ac06d0314b"} Apr 24 14:23:55.188521 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:55.188500 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s" Apr 24 14:23:56.019314 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:56.019027 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57rkt" Apr 24 14:23:56.019314 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:56.019156 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-57rkt" podUID="f33ebad9-63f4-4a25-865f-68c02ee70c85" Apr 24 14:23:56.179650 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:56.179478 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jd5s_e0bd6039-b2d8-405a-b478-69690078dd73/ovn-acl-logging/0.log" Apr 24 14:23:56.180014 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:56.179963 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s" event={"ID":"e0bd6039-b2d8-405a-b478-69690078dd73","Type":"ContainerStarted","Data":"f22bf7ce2d1eb97aac27a853b3ebd169d803d7478cbd6008369f08f911bdfafc"} Apr 24 14:23:56.180182 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:56.180158 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s" Apr 24 14:23:56.180294 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:56.180189 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s" Apr 24 14:23:56.181869 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:56.181848 2567 generic.go:358] "Generic (PLEG): container finished" podID="04393dd2-c684-4592-bc88-2223fac95a11" containerID="e0b787cfc7d010cb5b4284651f0deff3a7ee40425e7f63fb571d858832dc05c8" exitCode=0 Apr 24 14:23:56.181955 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:56.181883 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cb58k" event={"ID":"04393dd2-c684-4592-bc88-2223fac95a11","Type":"ContainerDied","Data":"e0b787cfc7d010cb5b4284651f0deff3a7ee40425e7f63fb571d858832dc05c8"} Apr 24 14:23:56.194056 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:56.194041 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s" Apr 24 14:23:56.204927 ip-10-0-128-36 kubenswrapper[2567]: 
I0424 14:23:56.204861 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s" podStartSLOduration=9.767801323 podStartE2EDuration="26.204849933s" podCreationTimestamp="2026-04-24 14:23:30 +0000 UTC" firstStartedPulling="2026-04-24 14:23:32.661370443 +0000 UTC m=+3.251523146" lastFinishedPulling="2026-04-24 14:23:49.098419056 +0000 UTC m=+19.688571756" observedRunningTime="2026-04-24 14:23:56.203435926 +0000 UTC m=+26.793588647" watchObservedRunningTime="2026-04-24 14:23:56.204849933 +0000 UTC m=+26.795002694" Apr 24 14:23:56.219378 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:56.219357 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-57rkt"] Apr 24 14:23:56.219459 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:56.219438 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57rkt" Apr 24 14:23:56.219528 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:56.219512 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57rkt" podUID="f33ebad9-63f4-4a25-865f-68c02ee70c85" Apr 24 14:23:56.222213 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:56.222187 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ct9nz"] Apr 24 14:23:56.222320 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:56.222288 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ct9nz" Apr 24 14:23:56.222377 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:56.222360 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ct9nz" podUID="952d5757-28bc-4940-9fa6-4a50ffff6476" Apr 24 14:23:57.185837 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:57.185781 2567 generic.go:358] "Generic (PLEG): container finished" podID="04393dd2-c684-4592-bc88-2223fac95a11" containerID="8cf2a478d7ff7d64cb320df3f5926d1eaa692afe1c5111948d3c3472ccd0a66d" exitCode=0 Apr 24 14:23:57.186217 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:57.185860 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cb58k" event={"ID":"04393dd2-c684-4592-bc88-2223fac95a11","Type":"ContainerDied","Data":"8cf2a478d7ff7d64cb320df3f5926d1eaa692afe1c5111948d3c3472ccd0a66d"} Apr 24 14:23:58.019053 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:58.019022 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57rkt" Apr 24 14:23:58.019206 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:23:58.019032 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ct9nz" Apr 24 14:23:58.019206 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:58.019140 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-57rkt" podUID="f33ebad9-63f4-4a25-865f-68c02ee70c85" Apr 24 14:23:58.019305 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:23:58.019243 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ct9nz" podUID="952d5757-28bc-4940-9fa6-4a50ffff6476" Apr 24 14:24:00.019453 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:00.019422 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ct9nz" Apr 24 14:24:00.020250 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:00.019550 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ct9nz" podUID="952d5757-28bc-4940-9fa6-4a50ffff6476" Apr 24 14:24:00.020250 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:00.019603 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57rkt" Apr 24 14:24:00.020250 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:00.019694 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-57rkt" podUID="f33ebad9-63f4-4a25-865f-68c02ee70c85" Apr 24 14:24:00.508489 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:00.508454 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-6gs2p"] Apr 24 14:24:00.543701 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:00.543657 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-6gs2p"] Apr 24 14:24:00.543877 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:00.543798 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6gs2p" Apr 24 14:24:00.543922 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:00.543894 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-6gs2p" podUID="9c15bbb5-ef9b-4df0-b792-073931b97ea8" Apr 24 14:24:00.642576 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:00.642541 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9c15bbb5-ef9b-4df0-b792-073931b97ea8-kubelet-config\") pod \"global-pull-secret-syncer-6gs2p\" (UID: \"9c15bbb5-ef9b-4df0-b792-073931b97ea8\") " pod="kube-system/global-pull-secret-syncer-6gs2p" Apr 24 14:24:00.642734 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:00.642630 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9c15bbb5-ef9b-4df0-b792-073931b97ea8-dbus\") pod \"global-pull-secret-syncer-6gs2p\" (UID: \"9c15bbb5-ef9b-4df0-b792-073931b97ea8\") " pod="kube-system/global-pull-secret-syncer-6gs2p" Apr 24 14:24:00.642734 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:00.642672 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9c15bbb5-ef9b-4df0-b792-073931b97ea8-original-pull-secret\") pod \"global-pull-secret-syncer-6gs2p\" (UID: \"9c15bbb5-ef9b-4df0-b792-073931b97ea8\") " pod="kube-system/global-pull-secret-syncer-6gs2p" Apr 24 14:24:00.743138 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:00.743100 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9c15bbb5-ef9b-4df0-b792-073931b97ea8-kubelet-config\") pod \"global-pull-secret-syncer-6gs2p\" (UID: \"9c15bbb5-ef9b-4df0-b792-073931b97ea8\") " pod="kube-system/global-pull-secret-syncer-6gs2p" Apr 24 14:24:00.743321 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:00.743179 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/9c15bbb5-ef9b-4df0-b792-073931b97ea8-dbus\") pod \"global-pull-secret-syncer-6gs2p\" (UID: \"9c15bbb5-ef9b-4df0-b792-073931b97ea8\") " pod="kube-system/global-pull-secret-syncer-6gs2p" Apr 24 14:24:00.743321 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:00.743213 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9c15bbb5-ef9b-4df0-b792-073931b97ea8-original-pull-secret\") pod \"global-pull-secret-syncer-6gs2p\" (UID: \"9c15bbb5-ef9b-4df0-b792-073931b97ea8\") " pod="kube-system/global-pull-secret-syncer-6gs2p" Apr 24 14:24:00.743321 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:00.743225 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9c15bbb5-ef9b-4df0-b792-073931b97ea8-kubelet-config\") pod \"global-pull-secret-syncer-6gs2p\" (UID: \"9c15bbb5-ef9b-4df0-b792-073931b97ea8\") " pod="kube-system/global-pull-secret-syncer-6gs2p" Apr 24 14:24:00.743458 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:00.743371 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 14:24:00.743458 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:00.743387 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9c15bbb5-ef9b-4df0-b792-073931b97ea8-dbus\") pod \"global-pull-secret-syncer-6gs2p\" (UID: \"9c15bbb5-ef9b-4df0-b792-073931b97ea8\") " pod="kube-system/global-pull-secret-syncer-6gs2p" Apr 24 14:24:00.743458 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:00.743434 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c15bbb5-ef9b-4df0-b792-073931b97ea8-original-pull-secret podName:9c15bbb5-ef9b-4df0-b792-073931b97ea8 nodeName:}" failed. 
No retries permitted until 2026-04-24 14:24:01.2434147 +0000 UTC m=+31.833567405 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9c15bbb5-ef9b-4df0-b792-073931b97ea8-original-pull-secret") pod "global-pull-secret-syncer-6gs2p" (UID: "9c15bbb5-ef9b-4df0-b792-073931b97ea8") : object "kube-system"/"original-pull-secret" not registered Apr 24 14:24:00.969918 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:00.969882 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-vrgvl" Apr 24 14:24:00.970072 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:00.970033 2567 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 24 14:24:00.971157 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:00.971129 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-vrgvl" Apr 24 14:24:01.248848 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:01.248735 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9c15bbb5-ef9b-4df0-b792-073931b97ea8-original-pull-secret\") pod \"global-pull-secret-syncer-6gs2p\" (UID: \"9c15bbb5-ef9b-4df0-b792-073931b97ea8\") " pod="kube-system/global-pull-secret-syncer-6gs2p" Apr 24 14:24:01.249373 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:01.248869 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 14:24:01.249373 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:01.248931 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c15bbb5-ef9b-4df0-b792-073931b97ea8-original-pull-secret podName:9c15bbb5-ef9b-4df0-b792-073931b97ea8 nodeName:}" failed. 
No retries permitted until 2026-04-24 14:24:02.24891511 +0000 UTC m=+32.839067810 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9c15bbb5-ef9b-4df0-b792-073931b97ea8-original-pull-secret") pod "global-pull-secret-syncer-6gs2p" (UID: "9c15bbb5-ef9b-4df0-b792-073931b97ea8") : object "kube-system"/"original-pull-secret" not registered Apr 24 14:24:02.018388 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.018301 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6gs2p" Apr 24 14:24:02.018388 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.018354 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ct9nz" Apr 24 14:24:02.018581 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:02.018444 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6gs2p" podUID="9c15bbb5-ef9b-4df0-b792-073931b97ea8" Apr 24 14:24:02.018581 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:02.018522 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ct9nz" podUID="952d5757-28bc-4940-9fa6-4a50ffff6476" Apr 24 14:24:02.018581 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.018549 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57rkt" Apr 24 14:24:02.018683 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:02.018642 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57rkt" podUID="f33ebad9-63f4-4a25-865f-68c02ee70c85" Apr 24 14:24:02.248748 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.248718 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-36.ec2.internal" event="NodeReady" Apr 24 14:24:02.248907 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.248886 2567 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 14:24:02.256675 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.256645 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9c15bbb5-ef9b-4df0-b792-073931b97ea8-original-pull-secret\") pod \"global-pull-secret-syncer-6gs2p\" (UID: \"9c15bbb5-ef9b-4df0-b792-073931b97ea8\") " pod="kube-system/global-pull-secret-syncer-6gs2p" Apr 24 14:24:02.256832 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:02.256761 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 14:24:02.256910 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:02.256852 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c15bbb5-ef9b-4df0-b792-073931b97ea8-original-pull-secret podName:9c15bbb5-ef9b-4df0-b792-073931b97ea8 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:04.256830776 +0000 UTC m=+34.846983477 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9c15bbb5-ef9b-4df0-b792-073931b97ea8-original-pull-secret") pod "global-pull-secret-syncer-6gs2p" (UID: "9c15bbb5-ef9b-4df0-b792-073931b97ea8") : object "kube-system"/"original-pull-secret" not registered Apr 24 14:24:02.281984 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.281897 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-l59rf"] Apr 24 14:24:02.319886 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.319845 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-55d6cb6cf9-tqwbb"] Apr 24 14:24:02.323671 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.320260 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-l59rf" Apr 24 14:24:02.323671 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.322635 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 24 14:24:02.323671 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.323205 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 24 14:24:02.323837 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.323828 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-vhdj2\"" Apr 24 14:24:02.324066 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.324047 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 24 14:24:02.324641 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.324621 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 24 14:24:02.342896 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.342871 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-xw99m"] Apr 24 14:24:02.343038 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.343016 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-55d6cb6cf9-tqwbb" Apr 24 14:24:02.345119 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.344980 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 24 14:24:02.345315 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.345296 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 24 14:24:02.346001 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.345978 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-vfj9f\"" Apr 24 14:24:02.346502 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.346482 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 24 14:24:02.352253 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.352139 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 24 14:24:02.365617 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.365597 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lq2ln"] Apr 24 14:24:02.365752 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.365736 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-xw99m" Apr 24 14:24:02.367460 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.367436 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 24 14:24:02.367673 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.367643 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-fz9k2\"" Apr 24 14:24:02.367777 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.367718 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 24 14:24:02.385118 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.385094 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-7979n"] Apr 24 14:24:02.385256 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.385152 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lq2ln" Apr 24 14:24:02.387149 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.387127 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 24 14:24:02.387372 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.387346 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 24 14:24:02.387466 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.387410 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 24 14:24:02.387529 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.387347 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-8bls2\"" Apr 24 14:24:02.396651 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.396629 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-l59rf"] Apr 24 14:24:02.396651 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.396656 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-qcxs6"] Apr 24 14:24:02.396842 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.396782 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-7979n" Apr 24 14:24:02.399801 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.399784 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 24 14:24:02.399932 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.399853 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 24 14:24:02.399932 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.399861 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 24 14:24:02.400038 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.399801 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-vfztj\"" Apr 24 14:24:02.400505 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.400488 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 24 14:24:02.404976 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.404959 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 24 14:24:02.410258 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.410237 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-28tw4"] Apr 24 14:24:02.410359 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.410315 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-qcxs6" Apr 24 14:24:02.412169 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.412149 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 14:24:02.412169 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.412167 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vrfmr\"" Apr 24 14:24:02.412322 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.412261 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 14:24:02.422129 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.422107 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qtdqf"] Apr 24 14:24:02.422257 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.422241 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-28tw4" Apr 24 14:24:02.424267 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.424244 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-bgwdb\"" Apr 24 14:24:02.424267 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.424256 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 24 14:24:02.424267 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.424256 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 24 14:24:02.424649 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.424630 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 24 14:24:02.424740 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.424718 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 24 14:24:02.434795 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.434772 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-rt2p5"] Apr 24 14:24:02.434987 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.434971 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qtdqf" Apr 24 14:24:02.437111 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.437066 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 24 14:24:02.437111 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.437070 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 24 14:24:02.437111 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.437106 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-47jqf\"" Apr 24 14:24:02.437388 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.437078 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 24 14:24:02.437388 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.437191 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 24 14:24:02.447859 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.447835 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6dd6f99597-fnwrd"] Apr 24 14:24:02.448017 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.447995 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-rt2p5" Apr 24 14:24:02.450088 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.450059 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 14:24:02.450194 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.450089 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-hr4gz\"" Apr 24 14:24:02.450194 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.450127 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 14:24:02.457906 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.457880 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnnkz\" (UniqueName: \"kubernetes.io/projected/55813995-c655-4039-9151-23a5a7023a30-kube-api-access-rnnkz\") pod \"cluster-samples-operator-6dc5bdb6b4-lq2ln\" (UID: \"55813995-c655-4039-9151-23a5a7023a30\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lq2ln" Apr 24 14:24:02.458011 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.457919 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cbc7c460-1a4a-47be-8d93-efd98ee46239-trusted-ca\") pod \"image-registry-55d6cb6cf9-tqwbb\" (UID: \"cbc7c460-1a4a-47be-8d93-efd98ee46239\") " pod="openshift-image-registry/image-registry-55d6cb6cf9-tqwbb" Apr 24 14:24:02.458011 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.457950 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/cbc7c460-1a4a-47be-8d93-efd98ee46239-installation-pull-secrets\") pod \"image-registry-55d6cb6cf9-tqwbb\" (UID: \"cbc7c460-1a4a-47be-8d93-efd98ee46239\") " pod="openshift-image-registry/image-registry-55d6cb6cf9-tqwbb" Apr 24 14:24:02.458011 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.457975 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv69l\" (UniqueName: \"kubernetes.io/projected/cbc7c460-1a4a-47be-8d93-efd98ee46239-kube-api-access-xv69l\") pod \"image-registry-55d6cb6cf9-tqwbb\" (UID: \"cbc7c460-1a4a-47be-8d93-efd98ee46239\") " pod="openshift-image-registry/image-registry-55d6cb6cf9-tqwbb" Apr 24 14:24:02.458011 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.458004 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d7576e0a-caf3-4817-8250-2b6570598ac0-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-l59rf\" (UID: \"d7576e0a-caf3-4817-8250-2b6570598ac0\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-l59rf" Apr 24 14:24:02.458167 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.458020 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-wsvfh"] Apr 24 14:24:02.458167 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.458062 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d7576e0a-caf3-4817-8250-2b6570598ac0-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-l59rf\" (UID: \"d7576e0a-caf3-4817-8250-2b6570598ac0\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-l59rf" Apr 24 14:24:02.458167 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.458089 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2tft\" (UniqueName: \"kubernetes.io/projected/d7576e0a-caf3-4817-8250-2b6570598ac0-kube-api-access-b2tft\") pod \"cluster-monitoring-operator-75587bd455-l59rf\" (UID: \"d7576e0a-caf3-4817-8250-2b6570598ac0\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-l59rf" Apr 24 14:24:02.458167 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.458111 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cbc7c460-1a4a-47be-8d93-efd98ee46239-ca-trust-extracted\") pod \"image-registry-55d6cb6cf9-tqwbb\" (UID: \"cbc7c460-1a4a-47be-8d93-efd98ee46239\") " pod="openshift-image-registry/image-registry-55d6cb6cf9-tqwbb" Apr 24 14:24:02.458310 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.458183 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cbc7c460-1a4a-47be-8d93-efd98ee46239-registry-certificates\") pod \"image-registry-55d6cb6cf9-tqwbb\" (UID: \"cbc7c460-1a4a-47be-8d93-efd98ee46239\") " pod="openshift-image-registry/image-registry-55d6cb6cf9-tqwbb" Apr 24 14:24:02.458310 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.458235 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cbc7c460-1a4a-47be-8d93-efd98ee46239-bound-sa-token\") pod \"image-registry-55d6cb6cf9-tqwbb\" (UID: \"cbc7c460-1a4a-47be-8d93-efd98ee46239\") " pod="openshift-image-registry/image-registry-55d6cb6cf9-tqwbb" Apr 24 14:24:02.458310 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.458269 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms64c\" (UniqueName: 
\"kubernetes.io/projected/0f90d0c4-7a9d-46c4-bb3a-d2d928bb1412-kube-api-access-ms64c\") pod \"volume-data-source-validator-7c6cbb6c87-xw99m\" (UID: \"0f90d0c4-7a9d-46c4-bb3a-d2d928bb1412\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-xw99m" Apr 24 14:24:02.458443 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.458320 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cbc7c460-1a4a-47be-8d93-efd98ee46239-image-registry-private-configuration\") pod \"image-registry-55d6cb6cf9-tqwbb\" (UID: \"cbc7c460-1a4a-47be-8d93-efd98ee46239\") " pod="openshift-image-registry/image-registry-55d6cb6cf9-tqwbb" Apr 24 14:24:02.458443 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.458368 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cbc7c460-1a4a-47be-8d93-efd98ee46239-registry-tls\") pod \"image-registry-55d6cb6cf9-tqwbb\" (UID: \"cbc7c460-1a4a-47be-8d93-efd98ee46239\") " pod="openshift-image-registry/image-registry-55d6cb6cf9-tqwbb" Apr 24 14:24:02.458443 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.458421 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/55813995-c655-4039-9151-23a5a7023a30-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lq2ln\" (UID: \"55813995-c655-4039-9151-23a5a7023a30\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lq2ln" Apr 24 14:24:02.458443 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.458435 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6dd6f99597-fnwrd" Apr 24 14:24:02.460067 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.460049 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 24 14:24:02.460412 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.460390 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 24 14:24:02.460543 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.460430 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 24 14:24:02.460543 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.460441 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 24 14:24:02.460668 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.460480 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-hthjm\"" Apr 24 14:24:02.472698 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.472678 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cdcd8c5b4-fff9m"] Apr 24 14:24:02.472854 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.472841 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wsvfh" Apr 24 14:24:02.474711 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.474686 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 24 14:24:02.474842 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.474744 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-gjz9f\"" Apr 24 14:24:02.474842 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.474748 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 24 14:24:02.493997 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.493957 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79d47f7669-4ff6w"] Apr 24 14:24:02.494173 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.494113 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cdcd8c5b4-fff9m" Apr 24 14:24:02.496281 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.496259 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 24 14:24:02.511795 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.511767 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lq2ln"] Apr 24 14:24:02.511795 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.511796 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-55d6cb6cf9-tqwbb"] Apr 24 14:24:02.512026 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.511824 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-95gsb"] Apr 24 14:24:02.512026 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.511918 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79d47f7669-4ff6w" Apr 24 14:24:02.514103 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.514066 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 24 14:24:02.514103 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.514090 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 24 14:24:02.514271 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.514095 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 24 14:24:02.514271 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.514096 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 24 14:24:02.526464 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.526434 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-xw99m"] Apr 24 14:24:02.526464 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.526461 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qtdqf"] Apr 24 14:24:02.526621 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.526473 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qcxs6"] Apr 24 14:24:02.526621 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.526483 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-rt2p5"] Apr 24 14:24:02.526621 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.526601 2567 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-95gsb" Apr 24 14:24:02.526751 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.526639 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cdcd8c5b4-fff9m"] Apr 24 14:24:02.526751 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.526679 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-28tw4"] Apr 24 14:24:02.526751 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.526692 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-7979n"] Apr 24 14:24:02.526751 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.526711 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-95gsb"] Apr 24 14:24:02.526751 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.526721 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-wsvfh"] Apr 24 14:24:02.526751 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.526731 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6dd6f99597-fnwrd"] Apr 24 14:24:02.526751 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.526741 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79d47f7669-4ff6w"] Apr 24 14:24:02.528945 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.528921 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 14:24:02.529062 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.528921 2567 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 14:24:02.529062 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.529017 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 14:24:02.529172 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.529061 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9tl5g\"" Apr 24 14:24:02.559513 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.559481 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d7576e0a-caf3-4817-8250-2b6570598ac0-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-l59rf\" (UID: \"d7576e0a-caf3-4817-8250-2b6570598ac0\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-l59rf" Apr 24 14:24:02.559672 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.559522 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cbc7c460-1a4a-47be-8d93-efd98ee46239-ca-trust-extracted\") pod \"image-registry-55d6cb6cf9-tqwbb\" (UID: \"cbc7c460-1a4a-47be-8d93-efd98ee46239\") " pod="openshift-image-registry/image-registry-55d6cb6cf9-tqwbb" Apr 24 14:24:02.559672 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.559548 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cbc7c460-1a4a-47be-8d93-efd98ee46239-registry-certificates\") pod \"image-registry-55d6cb6cf9-tqwbb\" (UID: \"cbc7c460-1a4a-47be-8d93-efd98ee46239\") " pod="openshift-image-registry/image-registry-55d6cb6cf9-tqwbb" Apr 24 14:24:02.559672 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.559582 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fee20684-697d-4937-9bf8-9549ab5442bf-trusted-ca\") pod \"console-operator-9d4b6777b-7979n\" (UID: \"fee20684-697d-4937-9bf8-9549ab5442bf\") " pod="openshift-console-operator/console-operator-9d4b6777b-7979n" Apr 24 14:24:02.559672 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.559607 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5jj7\" (UniqueName: \"kubernetes.io/projected/447bb5cc-c01c-4312-b625-551aa4e765b3-kube-api-access-q5jj7\") pod \"service-ca-operator-d6fc45fc5-qtdqf\" (UID: \"447bb5cc-c01c-4312-b625-551aa4e765b3\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qtdqf" Apr 24 14:24:02.559672 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:02.559633 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 14:24:02.559845 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:02.559704 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7576e0a-caf3-4817-8250-2b6570598ac0-cluster-monitoring-operator-tls podName:d7576e0a-caf3-4817-8250-2b6570598ac0 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:03.05968483 +0000 UTC m=+33.649837540 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d7576e0a-caf3-4817-8250-2b6570598ac0-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-l59rf" (UID: "d7576e0a-caf3-4817-8250-2b6570598ac0") : secret "cluster-monitoring-operator-tls" not found Apr 24 14:24:02.559845 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.559633 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b2tft\" (UniqueName: \"kubernetes.io/projected/d7576e0a-caf3-4817-8250-2b6570598ac0-kube-api-access-b2tft\") pod \"cluster-monitoring-operator-75587bd455-l59rf\" (UID: \"d7576e0a-caf3-4817-8250-2b6570598ac0\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-l59rf" Apr 24 14:24:02.559921 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.559850 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/3a2bcc78-148a-4024-85bf-f0bfdf9c6f93-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6dd6f99597-fnwrd\" (UID: \"3a2bcc78-148a-4024-85bf-f0bfdf9c6f93\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6dd6f99597-fnwrd" Apr 24 14:24:02.559921 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.559868 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cbc7c460-1a4a-47be-8d93-efd98ee46239-ca-trust-extracted\") pod \"image-registry-55d6cb6cf9-tqwbb\" (UID: \"cbc7c460-1a4a-47be-8d93-efd98ee46239\") " pod="openshift-image-registry/image-registry-55d6cb6cf9-tqwbb" Apr 24 14:24:02.559921 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.559882 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ms64c\" (UniqueName: 
\"kubernetes.io/projected/0f90d0c4-7a9d-46c4-bb3a-d2d928bb1412-kube-api-access-ms64c\") pod \"volume-data-source-validator-7c6cbb6c87-xw99m\" (UID: \"0f90d0c4-7a9d-46c4-bb3a-d2d928bb1412\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-xw99m" Apr 24 14:24:02.559921 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.559908 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wq4m8\" (UniqueName: \"kubernetes.io/projected/3a2bcc78-148a-4024-85bf-f0bfdf9c6f93-kube-api-access-wq4m8\") pod \"managed-serviceaccount-addon-agent-6dd6f99597-fnwrd\" (UID: \"3a2bcc78-148a-4024-85bf-f0bfdf9c6f93\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6dd6f99597-fnwrd" Apr 24 14:24:02.560882 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.560459 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cbc7c460-1a4a-47be-8d93-efd98ee46239-image-registry-private-configuration\") pod \"image-registry-55d6cb6cf9-tqwbb\" (UID: \"cbc7c460-1a4a-47be-8d93-efd98ee46239\") " pod="openshift-image-registry/image-registry-55d6cb6cf9-tqwbb" Apr 24 14:24:02.560882 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.560522 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gxzq\" (UniqueName: \"kubernetes.io/projected/743626cc-db6b-4ae5-a8dc-cceaa1cb8be0-kube-api-access-4gxzq\") pod \"dns-default-qcxs6\" (UID: \"743626cc-db6b-4ae5-a8dc-cceaa1cb8be0\") " pod="openshift-dns/dns-default-qcxs6" Apr 24 14:24:02.560882 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.560562 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rnnkz\" (UniqueName: \"kubernetes.io/projected/55813995-c655-4039-9151-23a5a7023a30-kube-api-access-rnnkz\") pod 
\"cluster-samples-operator-6dc5bdb6b4-lq2ln\" (UID: \"55813995-c655-4039-9151-23a5a7023a30\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lq2ln"
Apr 24 14:24:02.561093 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.560940 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cbc7c460-1a4a-47be-8d93-efd98ee46239-registry-certificates\") pod \"image-registry-55d6cb6cf9-tqwbb\" (UID: \"cbc7c460-1a4a-47be-8d93-efd98ee46239\") " pod="openshift-image-registry/image-registry-55d6cb6cf9-tqwbb"
Apr 24 14:24:02.561093 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.560960 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efd840e5-21ca-4b28-993c-7e8bd0b9a822-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-28tw4\" (UID: \"efd840e5-21ca-4b28-993c-7e8bd0b9a822\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-28tw4"
Apr 24 14:24:02.561093 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.560999 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cbc7c460-1a4a-47be-8d93-efd98ee46239-installation-pull-secrets\") pod \"image-registry-55d6cb6cf9-tqwbb\" (UID: \"cbc7c460-1a4a-47be-8d93-efd98ee46239\") " pod="openshift-image-registry/image-registry-55d6cb6cf9-tqwbb"
Apr 24 14:24:02.561093 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.561026 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/743626cc-db6b-4ae5-a8dc-cceaa1cb8be0-config-volume\") pod \"dns-default-qcxs6\" (UID: \"743626cc-db6b-4ae5-a8dc-cceaa1cb8be0\") " pod="openshift-dns/dns-default-qcxs6"
Apr 24 14:24:02.561093 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.561083 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/743626cc-db6b-4ae5-a8dc-cceaa1cb8be0-tmp-dir\") pod \"dns-default-qcxs6\" (UID: \"743626cc-db6b-4ae5-a8dc-cceaa1cb8be0\") " pod="openshift-dns/dns-default-qcxs6"
Apr 24 14:24:02.561290 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.561123 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/447bb5cc-c01c-4312-b625-551aa4e765b3-serving-cert\") pod \"service-ca-operator-d6fc45fc5-qtdqf\" (UID: \"447bb5cc-c01c-4312-b625-551aa4e765b3\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qtdqf"
Apr 24 14:24:02.561290 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.561158 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whvfg\" (UniqueName: \"kubernetes.io/projected/efd840e5-21ca-4b28-993c-7e8bd0b9a822-kube-api-access-whvfg\") pod \"kube-storage-version-migrator-operator-6769c5d45-28tw4\" (UID: \"efd840e5-21ca-4b28-993c-7e8bd0b9a822\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-28tw4"
Apr 24 14:24:02.561290 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.561217 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g776v\" (UniqueName: \"kubernetes.io/projected/5b2b1cb4-7903-4e07-aa4d-25298b638889-kube-api-access-g776v\") pod \"network-check-source-8894fc9bd-rt2p5\" (UID: \"5b2b1cb4-7903-4e07-aa4d-25298b638889\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-rt2p5"
Apr 24 14:24:02.561290 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.561263 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cbc7c460-1a4a-47be-8d93-efd98ee46239-registry-tls\") pod \"image-registry-55d6cb6cf9-tqwbb\" (UID: \"cbc7c460-1a4a-47be-8d93-efd98ee46239\") " pod="openshift-image-registry/image-registry-55d6cb6cf9-tqwbb"
Apr 24 14:24:02.561468 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.561296 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d63229a9-def3-4d90-95d7-748eed4531eb-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-wsvfh\" (UID: \"d63229a9-def3-4d90-95d7-748eed4531eb\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wsvfh"
Apr 24 14:24:02.561468 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.561371 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d63229a9-def3-4d90-95d7-748eed4531eb-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wsvfh\" (UID: \"d63229a9-def3-4d90-95d7-748eed4531eb\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wsvfh"
Apr 24 14:24:02.561468 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.561430 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cbc7c460-1a4a-47be-8d93-efd98ee46239-bound-sa-token\") pod \"image-registry-55d6cb6cf9-tqwbb\" (UID: \"cbc7c460-1a4a-47be-8d93-efd98ee46239\") " pod="openshift-image-registry/image-registry-55d6cb6cf9-tqwbb"
Apr 24 14:24:02.561604 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.561470 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fee20684-697d-4937-9bf8-9549ab5442bf-serving-cert\") pod \"console-operator-9d4b6777b-7979n\" (UID: \"fee20684-697d-4937-9bf8-9549ab5442bf\") " pod="openshift-console-operator/console-operator-9d4b6777b-7979n"
Apr 24 14:24:02.561604 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.561516 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efd840e5-21ca-4b28-993c-7e8bd0b9a822-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-28tw4\" (UID: \"efd840e5-21ca-4b28-993c-7e8bd0b9a822\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-28tw4"
Apr 24 14:24:02.561604 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.561547 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/743626cc-db6b-4ae5-a8dc-cceaa1cb8be0-metrics-tls\") pod \"dns-default-qcxs6\" (UID: \"743626cc-db6b-4ae5-a8dc-cceaa1cb8be0\") " pod="openshift-dns/dns-default-qcxs6"
Apr 24 14:24:02.561604 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.561585 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/55813995-c655-4039-9151-23a5a7023a30-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lq2ln\" (UID: \"55813995-c655-4039-9151-23a5a7023a30\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lq2ln"
Apr 24 14:24:02.561785 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.561621 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cbc7c460-1a4a-47be-8d93-efd98ee46239-trusted-ca\") pod \"image-registry-55d6cb6cf9-tqwbb\" (UID: \"cbc7c460-1a4a-47be-8d93-efd98ee46239\") " pod="openshift-image-registry/image-registry-55d6cb6cf9-tqwbb"
Apr 24 14:24:02.561785 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.561655 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-278jh\" (UniqueName: \"kubernetes.io/projected/fee20684-697d-4937-9bf8-9549ab5442bf-kube-api-access-278jh\") pod \"console-operator-9d4b6777b-7979n\" (UID: \"fee20684-697d-4937-9bf8-9549ab5442bf\") " pod="openshift-console-operator/console-operator-9d4b6777b-7979n"
Apr 24 14:24:02.561785 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.561684 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/447bb5cc-c01c-4312-b625-551aa4e765b3-config\") pod \"service-ca-operator-d6fc45fc5-qtdqf\" (UID: \"447bb5cc-c01c-4312-b625-551aa4e765b3\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qtdqf"
Apr 24 14:24:02.561785 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.561716 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xv69l\" (UniqueName: \"kubernetes.io/projected/cbc7c460-1a4a-47be-8d93-efd98ee46239-kube-api-access-xv69l\") pod \"image-registry-55d6cb6cf9-tqwbb\" (UID: \"cbc7c460-1a4a-47be-8d93-efd98ee46239\") " pod="openshift-image-registry/image-registry-55d6cb6cf9-tqwbb"
Apr 24 14:24:02.561785 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.561749 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fee20684-697d-4937-9bf8-9549ab5442bf-config\") pod \"console-operator-9d4b6777b-7979n\" (UID: \"fee20684-697d-4937-9bf8-9549ab5442bf\") " pod="openshift-console-operator/console-operator-9d4b6777b-7979n"
Apr 24 14:24:02.561785 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.561780 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d7576e0a-caf3-4817-8250-2b6570598ac0-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-l59rf\" (UID: \"d7576e0a-caf3-4817-8250-2b6570598ac0\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-l59rf"
Apr 24 14:24:02.563671 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.562485 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d7576e0a-caf3-4817-8250-2b6570598ac0-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-l59rf\" (UID: \"d7576e0a-caf3-4817-8250-2b6570598ac0\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-l59rf"
Apr 24 14:24:02.563671 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.563300 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cbc7c460-1a4a-47be-8d93-efd98ee46239-trusted-ca\") pod \"image-registry-55d6cb6cf9-tqwbb\" (UID: \"cbc7c460-1a4a-47be-8d93-efd98ee46239\") " pod="openshift-image-registry/image-registry-55d6cb6cf9-tqwbb"
Apr 24 14:24:02.563671 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:02.563392 2567 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 14:24:02.564637 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:02.563927 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 14:24:02.564637 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:02.563943 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55d6cb6cf9-tqwbb: secret "image-registry-tls" not found
Apr 24 14:24:02.564637 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:02.564001 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cbc7c460-1a4a-47be-8d93-efd98ee46239-registry-tls podName:cbc7c460-1a4a-47be-8d93-efd98ee46239 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:03.063984415 +0000 UTC m=+33.654137117 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cbc7c460-1a4a-47be-8d93-efd98ee46239-registry-tls") pod "image-registry-55d6cb6cf9-tqwbb" (UID: "cbc7c460-1a4a-47be-8d93-efd98ee46239") : secret "image-registry-tls" not found
Apr 24 14:24:02.564637 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:02.564063 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55813995-c655-4039-9151-23a5a7023a30-samples-operator-tls podName:55813995-c655-4039-9151-23a5a7023a30 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:03.064053136 +0000 UTC m=+33.654205840 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/55813995-c655-4039-9151-23a5a7023a30-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-lq2ln" (UID: "55813995-c655-4039-9151-23a5a7023a30") : secret "samples-operator-tls" not found
Apr 24 14:24:02.566336 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.566315 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cbc7c460-1a4a-47be-8d93-efd98ee46239-image-registry-private-configuration\") pod \"image-registry-55d6cb6cf9-tqwbb\" (UID: \"cbc7c460-1a4a-47be-8d93-efd98ee46239\") " pod="openshift-image-registry/image-registry-55d6cb6cf9-tqwbb"
Apr 24 14:24:02.572462 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.572356 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms64c\" (UniqueName: \"kubernetes.io/projected/0f90d0c4-7a9d-46c4-bb3a-d2d928bb1412-kube-api-access-ms64c\") pod \"volume-data-source-validator-7c6cbb6c87-xw99m\" (UID: \"0f90d0c4-7a9d-46c4-bb3a-d2d928bb1412\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-xw99m"
Apr 24 14:24:02.573021 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.572973 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2tft\" (UniqueName: \"kubernetes.io/projected/d7576e0a-caf3-4817-8250-2b6570598ac0-kube-api-access-b2tft\") pod \"cluster-monitoring-operator-75587bd455-l59rf\" (UID: \"d7576e0a-caf3-4817-8250-2b6570598ac0\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-l59rf"
Apr 24 14:24:02.573175 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.573149 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnnkz\" (UniqueName: \"kubernetes.io/projected/55813995-c655-4039-9151-23a5a7023a30-kube-api-access-rnnkz\") pod \"cluster-samples-operator-6dc5bdb6b4-lq2ln\" (UID: \"55813995-c655-4039-9151-23a5a7023a30\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lq2ln"
Apr 24 14:24:02.573388 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.573259 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv69l\" (UniqueName: \"kubernetes.io/projected/cbc7c460-1a4a-47be-8d93-efd98ee46239-kube-api-access-xv69l\") pod \"image-registry-55d6cb6cf9-tqwbb\" (UID: \"cbc7c460-1a4a-47be-8d93-efd98ee46239\") " pod="openshift-image-registry/image-registry-55d6cb6cf9-tqwbb"
Apr 24 14:24:02.574100 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.574077 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cbc7c460-1a4a-47be-8d93-efd98ee46239-bound-sa-token\") pod \"image-registry-55d6cb6cf9-tqwbb\" (UID: \"cbc7c460-1a4a-47be-8d93-efd98ee46239\") " pod="openshift-image-registry/image-registry-55d6cb6cf9-tqwbb"
Apr 24 14:24:02.574753 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.574730 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cbc7c460-1a4a-47be-8d93-efd98ee46239-installation-pull-secrets\") pod \"image-registry-55d6cb6cf9-tqwbb\" (UID: \"cbc7c460-1a4a-47be-8d93-efd98ee46239\") " pod="openshift-image-registry/image-registry-55d6cb6cf9-tqwbb"
Apr 24 14:24:02.662211 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.662176 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4gxzq\" (UniqueName: \"kubernetes.io/projected/743626cc-db6b-4ae5-a8dc-cceaa1cb8be0-kube-api-access-4gxzq\") pod \"dns-default-qcxs6\" (UID: \"743626cc-db6b-4ae5-a8dc-cceaa1cb8be0\") " pod="openshift-dns/dns-default-qcxs6"
Apr 24 14:24:02.662211 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.662218 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/84bc3c16-3cf7-443f-887d-67eaa7bb7631-hub\") pod \"cluster-proxy-proxy-agent-79d47f7669-4ff6w\" (UID: \"84bc3c16-3cf7-443f-887d-67eaa7bb7631\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79d47f7669-4ff6w"
Apr 24 14:24:02.662461 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.662238 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efd840e5-21ca-4b28-993c-7e8bd0b9a822-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-28tw4\" (UID: \"efd840e5-21ca-4b28-993c-7e8bd0b9a822\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-28tw4"
Apr 24 14:24:02.662461 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.662258 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/52687235-a07e-4c1c-83a6-1f59714d998d-tmp\") pod \"klusterlet-addon-workmgr-6cdcd8c5b4-fff9m\" (UID: \"52687235-a07e-4c1c-83a6-1f59714d998d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cdcd8c5b4-fff9m"
Apr 24 14:24:02.662461 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.662283 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/743626cc-db6b-4ae5-a8dc-cceaa1cb8be0-config-volume\") pod \"dns-default-qcxs6\" (UID: \"743626cc-db6b-4ae5-a8dc-cceaa1cb8be0\") " pod="openshift-dns/dns-default-qcxs6"
Apr 24 14:24:02.662461 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.662306 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/743626cc-db6b-4ae5-a8dc-cceaa1cb8be0-tmp-dir\") pod \"dns-default-qcxs6\" (UID: \"743626cc-db6b-4ae5-a8dc-cceaa1cb8be0\") " pod="openshift-dns/dns-default-qcxs6"
Apr 24 14:24:02.662461 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.662332 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d834bc9a-bc43-42cc-82ed-5b3a77d4da5d-cert\") pod \"ingress-canary-95gsb\" (UID: \"d834bc9a-bc43-42cc-82ed-5b3a77d4da5d\") " pod="openshift-ingress-canary/ingress-canary-95gsb"
Apr 24 14:24:02.662461 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.662359 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k46h7\" (UniqueName: \"kubernetes.io/projected/52687235-a07e-4c1c-83a6-1f59714d998d-kube-api-access-k46h7\") pod \"klusterlet-addon-workmgr-6cdcd8c5b4-fff9m\" (UID: \"52687235-a07e-4c1c-83a6-1f59714d998d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cdcd8c5b4-fff9m"
Apr 24 14:24:02.662461 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.662391 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/447bb5cc-c01c-4312-b625-551aa4e765b3-serving-cert\") pod \"service-ca-operator-d6fc45fc5-qtdqf\" (UID: \"447bb5cc-c01c-4312-b625-551aa4e765b3\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qtdqf"
Apr 24 14:24:02.662461 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.662415 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2rvw\" (UniqueName: \"kubernetes.io/projected/84bc3c16-3cf7-443f-887d-67eaa7bb7631-kube-api-access-t2rvw\") pod \"cluster-proxy-proxy-agent-79d47f7669-4ff6w\" (UID: \"84bc3c16-3cf7-443f-887d-67eaa7bb7631\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79d47f7669-4ff6w"
Apr 24 14:24:02.662461 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.662439 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-whvfg\" (UniqueName: \"kubernetes.io/projected/efd840e5-21ca-4b28-993c-7e8bd0b9a822-kube-api-access-whvfg\") pod \"kube-storage-version-migrator-operator-6769c5d45-28tw4\" (UID: \"efd840e5-21ca-4b28-993c-7e8bd0b9a822\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-28tw4"
Apr 24 14:24:02.662852 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.662480 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/84bc3c16-3cf7-443f-887d-67eaa7bb7631-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-79d47f7669-4ff6w\" (UID: \"84bc3c16-3cf7-443f-887d-67eaa7bb7631\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79d47f7669-4ff6w"
Apr 24 14:24:02.662852 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.662509 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g776v\" (UniqueName: \"kubernetes.io/projected/5b2b1cb4-7903-4e07-aa4d-25298b638889-kube-api-access-g776v\") pod \"network-check-source-8894fc9bd-rt2p5\" (UID: \"5b2b1cb4-7903-4e07-aa4d-25298b638889\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-rt2p5"
Apr 24 14:24:02.662852 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.662537 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdscs\" (UniqueName: \"kubernetes.io/projected/d834bc9a-bc43-42cc-82ed-5b3a77d4da5d-kube-api-access-sdscs\") pod \"ingress-canary-95gsb\" (UID: \"d834bc9a-bc43-42cc-82ed-5b3a77d4da5d\") " pod="openshift-ingress-canary/ingress-canary-95gsb"
Apr 24 14:24:02.662852 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.662581 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d63229a9-def3-4d90-95d7-748eed4531eb-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-wsvfh\" (UID: \"d63229a9-def3-4d90-95d7-748eed4531eb\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wsvfh"
Apr 24 14:24:02.662852 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.662607 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d63229a9-def3-4d90-95d7-748eed4531eb-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wsvfh\" (UID: \"d63229a9-def3-4d90-95d7-748eed4531eb\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wsvfh"
Apr 24 14:24:02.662852 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.662647 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/84bc3c16-3cf7-443f-887d-67eaa7bb7631-ca\") pod \"cluster-proxy-proxy-agent-79d47f7669-4ff6w\" (UID: \"84bc3c16-3cf7-443f-887d-67eaa7bb7631\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79d47f7669-4ff6w"
Apr 24 14:24:02.662852 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.662656 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/743626cc-db6b-4ae5-a8dc-cceaa1cb8be0-tmp-dir\") pod \"dns-default-qcxs6\" (UID: \"743626cc-db6b-4ae5-a8dc-cceaa1cb8be0\") " pod="openshift-dns/dns-default-qcxs6"
Apr 24 14:24:02.662852 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.662671 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fee20684-697d-4937-9bf8-9549ab5442bf-serving-cert\") pod \"console-operator-9d4b6777b-7979n\" (UID: \"fee20684-697d-4937-9bf8-9549ab5442bf\") " pod="openshift-console-operator/console-operator-9d4b6777b-7979n"
Apr 24 14:24:02.662852 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.662713 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efd840e5-21ca-4b28-993c-7e8bd0b9a822-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-28tw4\" (UID: \"efd840e5-21ca-4b28-993c-7e8bd0b9a822\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-28tw4"
Apr 24 14:24:02.662852 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.662746 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/743626cc-db6b-4ae5-a8dc-cceaa1cb8be0-metrics-tls\") pod \"dns-default-qcxs6\" (UID: \"743626cc-db6b-4ae5-a8dc-cceaa1cb8be0\") " pod="openshift-dns/dns-default-qcxs6"
Apr 24 14:24:02.662852 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.662778 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/52687235-a07e-4c1c-83a6-1f59714d998d-klusterlet-config\") pod \"klusterlet-addon-workmgr-6cdcd8c5b4-fff9m\" (UID: \"52687235-a07e-4c1c-83a6-1f59714d998d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cdcd8c5b4-fff9m"
Apr 24 14:24:02.662852 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.662851 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-278jh\" (UniqueName: \"kubernetes.io/projected/fee20684-697d-4937-9bf8-9549ab5442bf-kube-api-access-278jh\") pod \"console-operator-9d4b6777b-7979n\" (UID: \"fee20684-697d-4937-9bf8-9549ab5442bf\") " pod="openshift-console-operator/console-operator-9d4b6777b-7979n"
Apr 24 14:24:02.663434 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.662876 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/447bb5cc-c01c-4312-b625-551aa4e765b3-config\") pod \"service-ca-operator-d6fc45fc5-qtdqf\" (UID: \"447bb5cc-c01c-4312-b625-551aa4e765b3\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qtdqf"
Apr 24 14:24:02.663434 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.662904 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fee20684-697d-4937-9bf8-9549ab5442bf-config\") pod \"console-operator-9d4b6777b-7979n\" (UID: \"fee20684-697d-4937-9bf8-9549ab5442bf\") " pod="openshift-console-operator/console-operator-9d4b6777b-7979n"
Apr 24 14:24:02.663434 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.662923 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/743626cc-db6b-4ae5-a8dc-cceaa1cb8be0-config-volume\") pod \"dns-default-qcxs6\" (UID: \"743626cc-db6b-4ae5-a8dc-cceaa1cb8be0\") " pod="openshift-dns/dns-default-qcxs6"
Apr 24 14:24:02.663434 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.662981 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fee20684-697d-4937-9bf8-9549ab5442bf-trusted-ca\") pod \"console-operator-9d4b6777b-7979n\" (UID: \"fee20684-697d-4937-9bf8-9549ab5442bf\") " pod="openshift-console-operator/console-operator-9d4b6777b-7979n"
Apr 24 14:24:02.663434 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.663013 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q5jj7\" (UniqueName: \"kubernetes.io/projected/447bb5cc-c01c-4312-b625-551aa4e765b3-kube-api-access-q5jj7\") pod \"service-ca-operator-d6fc45fc5-qtdqf\" (UID: \"447bb5cc-c01c-4312-b625-551aa4e765b3\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qtdqf"
Apr 24 14:24:02.663434 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.663043 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/3a2bcc78-148a-4024-85bf-f0bfdf9c6f93-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6dd6f99597-fnwrd\" (UID: \"3a2bcc78-148a-4024-85bf-f0bfdf9c6f93\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6dd6f99597-fnwrd"
Apr 24 14:24:02.663434 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.663074 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/84bc3c16-3cf7-443f-887d-67eaa7bb7631-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-79d47f7669-4ff6w\" (UID: \"84bc3c16-3cf7-443f-887d-67eaa7bb7631\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79d47f7669-4ff6w"
Apr 24 14:24:02.663434 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.663106 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wq4m8\" (UniqueName: \"kubernetes.io/projected/3a2bcc78-148a-4024-85bf-f0bfdf9c6f93-kube-api-access-wq4m8\") pod \"managed-serviceaccount-addon-agent-6dd6f99597-fnwrd\" (UID: \"3a2bcc78-148a-4024-85bf-f0bfdf9c6f93\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6dd6f99597-fnwrd"
Apr 24 14:24:02.663434 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.663137 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/84bc3c16-3cf7-443f-887d-67eaa7bb7631-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-79d47f7669-4ff6w\" (UID: \"84bc3c16-3cf7-443f-887d-67eaa7bb7631\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79d47f7669-4ff6w"
Apr 24 14:24:02.663434 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.663309 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d63229a9-def3-4d90-95d7-748eed4531eb-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-wsvfh\" (UID: \"d63229a9-def3-4d90-95d7-748eed4531eb\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wsvfh"
Apr 24 14:24:02.663928 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.663590 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fee20684-697d-4937-9bf8-9549ab5442bf-config\") pod \"console-operator-9d4b6777b-7979n\" (UID: \"fee20684-697d-4937-9bf8-9549ab5442bf\") " pod="openshift-console-operator/console-operator-9d4b6777b-7979n"
Apr 24 14:24:02.663928 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.663611 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/447bb5cc-c01c-4312-b625-551aa4e765b3-config\") pod \"service-ca-operator-d6fc45fc5-qtdqf\" (UID: \"447bb5cc-c01c-4312-b625-551aa4e765b3\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qtdqf"
Apr 24 14:24:02.663928 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:02.663668 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 14:24:02.663928 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:02.663800 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 24 14:24:02.663928 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.663839 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efd840e5-21ca-4b28-993c-7e8bd0b9a822-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-28tw4\" (UID: \"efd840e5-21ca-4b28-993c-7e8bd0b9a822\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-28tw4"
Apr 24 14:24:02.663928 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:02.663859 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/743626cc-db6b-4ae5-a8dc-cceaa1cb8be0-metrics-tls podName:743626cc-db6b-4ae5-a8dc-cceaa1cb8be0 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:03.163837068 +0000 UTC m=+33.753989785 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/743626cc-db6b-4ae5-a8dc-cceaa1cb8be0-metrics-tls") pod "dns-default-qcxs6" (UID: "743626cc-db6b-4ae5-a8dc-cceaa1cb8be0") : secret "dns-default-metrics-tls" not found
Apr 24 14:24:02.663928 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:02.663880 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d63229a9-def3-4d90-95d7-748eed4531eb-networking-console-plugin-cert podName:d63229a9-def3-4d90-95d7-748eed4531eb nodeName:}" failed. No retries permitted until 2026-04-24 14:24:03.163869544 +0000 UTC m=+33.754022247 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/d63229a9-def3-4d90-95d7-748eed4531eb-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-wsvfh" (UID: "d63229a9-def3-4d90-95d7-748eed4531eb") : secret "networking-console-plugin-cert" not found
Apr 24 14:24:02.664459 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.664438 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fee20684-697d-4937-9bf8-9549ab5442bf-trusted-ca\") pod \"console-operator-9d4b6777b-7979n\" (UID: \"fee20684-697d-4937-9bf8-9549ab5442bf\") " pod="openshift-console-operator/console-operator-9d4b6777b-7979n"
Apr 24 14:24:02.665340 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.665316 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/447bb5cc-c01c-4312-b625-551aa4e765b3-serving-cert\") pod \"service-ca-operator-d6fc45fc5-qtdqf\" (UID: \"447bb5cc-c01c-4312-b625-551aa4e765b3\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qtdqf"
Apr 24 14:24:02.665442 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.665408 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efd840e5-21ca-4b28-993c-7e8bd0b9a822-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-28tw4\" (UID: \"efd840e5-21ca-4b28-993c-7e8bd0b9a822\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-28tw4"
Apr 24 14:24:02.665715 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.665690 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fee20684-697d-4937-9bf8-9549ab5442bf-serving-cert\") pod \"console-operator-9d4b6777b-7979n\" (UID: \"fee20684-697d-4937-9bf8-9549ab5442bf\") " pod="openshift-console-operator/console-operator-9d4b6777b-7979n"
Apr 24 14:24:02.666222 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.666200 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/3a2bcc78-148a-4024-85bf-f0bfdf9c6f93-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6dd6f99597-fnwrd\" (UID: \"3a2bcc78-148a-4024-85bf-f0bfdf9c6f93\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6dd6f99597-fnwrd"
Apr 24 14:24:02.669782 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.669762 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-278jh\" (UniqueName: \"kubernetes.io/projected/fee20684-697d-4937-9bf8-9549ab5442bf-kube-api-access-278jh\") pod \"console-operator-9d4b6777b-7979n\" (UID: \"fee20684-697d-4937-9bf8-9549ab5442bf\") " pod="openshift-console-operator/console-operator-9d4b6777b-7979n"
Apr 24 14:24:02.674862 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.674330 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5jj7\" (UniqueName: \"kubernetes.io/projected/447bb5cc-c01c-4312-b625-551aa4e765b3-kube-api-access-q5jj7\") pod \"service-ca-operator-d6fc45fc5-qtdqf\" (UID: \"447bb5cc-c01c-4312-b625-551aa4e765b3\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qtdqf"
Apr 24 14:24:02.674862 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.674427 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-xw99m"
Apr 24 14:24:02.674862 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.674791 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gxzq\" (UniqueName: \"kubernetes.io/projected/743626cc-db6b-4ae5-a8dc-cceaa1cb8be0-kube-api-access-4gxzq\") pod \"dns-default-qcxs6\" (UID: \"743626cc-db6b-4ae5-a8dc-cceaa1cb8be0\") " pod="openshift-dns/dns-default-qcxs6"
Apr 24 14:24:02.675130 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.675065 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g776v\" (UniqueName: \"kubernetes.io/projected/5b2b1cb4-7903-4e07-aa4d-25298b638889-kube-api-access-g776v\") pod \"network-check-source-8894fc9bd-rt2p5\" (UID: \"5b2b1cb4-7903-4e07-aa4d-25298b638889\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-rt2p5"
Apr 24 14:24:02.676252 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.676232 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wq4m8\" (UniqueName: \"kubernetes.io/projected/3a2bcc78-148a-4024-85bf-f0bfdf9c6f93-kube-api-access-wq4m8\") pod \"managed-serviceaccount-addon-agent-6dd6f99597-fnwrd\" (UID: \"3a2bcc78-148a-4024-85bf-f0bfdf9c6f93\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6dd6f99597-fnwrd"
Apr 24 14:24:02.676404 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.676384 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-whvfg\" (UniqueName: \"kubernetes.io/projected/efd840e5-21ca-4b28-993c-7e8bd0b9a822-kube-api-access-whvfg\") pod \"kube-storage-version-migrator-operator-6769c5d45-28tw4\" (UID: \"efd840e5-21ca-4b28-993c-7e8bd0b9a822\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-28tw4"
Apr 24 14:24:02.707621 ip-10-0-128-36
kubenswrapper[2567]: I0424 14:24:02.707590 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-7979n" Apr 24 14:24:02.731491 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.731462 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-28tw4" Apr 24 14:24:02.746929 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.746894 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qtdqf" Apr 24 14:24:02.757606 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.757580 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-rt2p5" Apr 24 14:24:02.764508 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.764485 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/84bc3c16-3cf7-443f-887d-67eaa7bb7631-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-79d47f7669-4ff6w\" (UID: \"84bc3c16-3cf7-443f-887d-67eaa7bb7631\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79d47f7669-4ff6w" Apr 24 14:24:02.764623 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.764522 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/84bc3c16-3cf7-443f-887d-67eaa7bb7631-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-79d47f7669-4ff6w\" (UID: \"84bc3c16-3cf7-443f-887d-67eaa7bb7631\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79d47f7669-4ff6w" Apr 24 14:24:02.764623 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.764563 2567 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"hub\" (UniqueName: \"kubernetes.io/secret/84bc3c16-3cf7-443f-887d-67eaa7bb7631-hub\") pod \"cluster-proxy-proxy-agent-79d47f7669-4ff6w\" (UID: \"84bc3c16-3cf7-443f-887d-67eaa7bb7631\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79d47f7669-4ff6w" Apr 24 14:24:02.764623 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.764584 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/52687235-a07e-4c1c-83a6-1f59714d998d-tmp\") pod \"klusterlet-addon-workmgr-6cdcd8c5b4-fff9m\" (UID: \"52687235-a07e-4c1c-83a6-1f59714d998d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cdcd8c5b4-fff9m" Apr 24 14:24:02.764623 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.764612 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d834bc9a-bc43-42cc-82ed-5b3a77d4da5d-cert\") pod \"ingress-canary-95gsb\" (UID: \"d834bc9a-bc43-42cc-82ed-5b3a77d4da5d\") " pod="openshift-ingress-canary/ingress-canary-95gsb" Apr 24 14:24:02.764852 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.764640 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k46h7\" (UniqueName: \"kubernetes.io/projected/52687235-a07e-4c1c-83a6-1f59714d998d-kube-api-access-k46h7\") pod \"klusterlet-addon-workmgr-6cdcd8c5b4-fff9m\" (UID: \"52687235-a07e-4c1c-83a6-1f59714d998d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cdcd8c5b4-fff9m" Apr 24 14:24:02.764852 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.764672 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t2rvw\" (UniqueName: \"kubernetes.io/projected/84bc3c16-3cf7-443f-887d-67eaa7bb7631-kube-api-access-t2rvw\") pod \"cluster-proxy-proxy-agent-79d47f7669-4ff6w\" (UID: \"84bc3c16-3cf7-443f-887d-67eaa7bb7631\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79d47f7669-4ff6w" Apr 24 14:24:02.764852 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.764705 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/84bc3c16-3cf7-443f-887d-67eaa7bb7631-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-79d47f7669-4ff6w\" (UID: \"84bc3c16-3cf7-443f-887d-67eaa7bb7631\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79d47f7669-4ff6w" Apr 24 14:24:02.764852 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.764736 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sdscs\" (UniqueName: \"kubernetes.io/projected/d834bc9a-bc43-42cc-82ed-5b3a77d4da5d-kube-api-access-sdscs\") pod \"ingress-canary-95gsb\" (UID: \"d834bc9a-bc43-42cc-82ed-5b3a77d4da5d\") " pod="openshift-ingress-canary/ingress-canary-95gsb" Apr 24 14:24:02.764852 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.764797 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/84bc3c16-3cf7-443f-887d-67eaa7bb7631-ca\") pod \"cluster-proxy-proxy-agent-79d47f7669-4ff6w\" (UID: \"84bc3c16-3cf7-443f-887d-67eaa7bb7631\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79d47f7669-4ff6w" Apr 24 14:24:02.765057 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:02.764913 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 14:24:02.765057 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:02.764998 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d834bc9a-bc43-42cc-82ed-5b3a77d4da5d-cert podName:d834bc9a-bc43-42cc-82ed-5b3a77d4da5d nodeName:}" failed. No retries permitted until 2026-04-24 14:24:03.264976304 +0000 UTC m=+33.855129011 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d834bc9a-bc43-42cc-82ed-5b3a77d4da5d-cert") pod "ingress-canary-95gsb" (UID: "d834bc9a-bc43-42cc-82ed-5b3a77d4da5d") : secret "canary-serving-cert" not found Apr 24 14:24:02.765162 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.765145 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/52687235-a07e-4c1c-83a6-1f59714d998d-klusterlet-config\") pod \"klusterlet-addon-workmgr-6cdcd8c5b4-fff9m\" (UID: \"52687235-a07e-4c1c-83a6-1f59714d998d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cdcd8c5b4-fff9m" Apr 24 14:24:02.765527 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.765462 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/52687235-a07e-4c1c-83a6-1f59714d998d-tmp\") pod \"klusterlet-addon-workmgr-6cdcd8c5b4-fff9m\" (UID: \"52687235-a07e-4c1c-83a6-1f59714d998d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cdcd8c5b4-fff9m" Apr 24 14:24:02.765653 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.765630 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/84bc3c16-3cf7-443f-887d-67eaa7bb7631-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-79d47f7669-4ff6w\" (UID: \"84bc3c16-3cf7-443f-887d-67eaa7bb7631\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79d47f7669-4ff6w" Apr 24 14:24:02.768109 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.768088 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/84bc3c16-3cf7-443f-887d-67eaa7bb7631-ca\") pod \"cluster-proxy-proxy-agent-79d47f7669-4ff6w\" (UID: \"84bc3c16-3cf7-443f-887d-67eaa7bb7631\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79d47f7669-4ff6w" Apr 24 14:24:02.768205 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.768098 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/84bc3c16-3cf7-443f-887d-67eaa7bb7631-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-79d47f7669-4ff6w\" (UID: \"84bc3c16-3cf7-443f-887d-67eaa7bb7631\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79d47f7669-4ff6w" Apr 24 14:24:02.768205 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.768160 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/52687235-a07e-4c1c-83a6-1f59714d998d-klusterlet-config\") pod \"klusterlet-addon-workmgr-6cdcd8c5b4-fff9m\" (UID: \"52687235-a07e-4c1c-83a6-1f59714d998d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cdcd8c5b4-fff9m" Apr 24 14:24:02.768205 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.768158 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/84bc3c16-3cf7-443f-887d-67eaa7bb7631-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-79d47f7669-4ff6w\" (UID: \"84bc3c16-3cf7-443f-887d-67eaa7bb7631\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79d47f7669-4ff6w" Apr 24 14:24:02.768205 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.768191 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/84bc3c16-3cf7-443f-887d-67eaa7bb7631-hub\") pod \"cluster-proxy-proxy-agent-79d47f7669-4ff6w\" (UID: \"84bc3c16-3cf7-443f-887d-67eaa7bb7631\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79d47f7669-4ff6w" Apr 24 14:24:02.773506 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.773481 2567 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2rvw\" (UniqueName: \"kubernetes.io/projected/84bc3c16-3cf7-443f-887d-67eaa7bb7631-kube-api-access-t2rvw\") pod \"cluster-proxy-proxy-agent-79d47f7669-4ff6w\" (UID: \"84bc3c16-3cf7-443f-887d-67eaa7bb7631\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79d47f7669-4ff6w" Apr 24 14:24:02.773654 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.773633 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k46h7\" (UniqueName: \"kubernetes.io/projected/52687235-a07e-4c1c-83a6-1f59714d998d-kube-api-access-k46h7\") pod \"klusterlet-addon-workmgr-6cdcd8c5b4-fff9m\" (UID: \"52687235-a07e-4c1c-83a6-1f59714d998d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cdcd8c5b4-fff9m" Apr 24 14:24:02.773726 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.773717 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6dd6f99597-fnwrd" Apr 24 14:24:02.774239 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.774219 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdscs\" (UniqueName: \"kubernetes.io/projected/d834bc9a-bc43-42cc-82ed-5b3a77d4da5d-kube-api-access-sdscs\") pod \"ingress-canary-95gsb\" (UID: \"d834bc9a-bc43-42cc-82ed-5b3a77d4da5d\") " pod="openshift-ingress-canary/ingress-canary-95gsb" Apr 24 14:24:02.803798 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.803726 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cdcd8c5b4-fff9m" Apr 24 14:24:02.821418 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:02.821397 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79d47f7669-4ff6w" Apr 24 14:24:03.072746 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:03.071708 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cbc7c460-1a4a-47be-8d93-efd98ee46239-registry-tls\") pod \"image-registry-55d6cb6cf9-tqwbb\" (UID: \"cbc7c460-1a4a-47be-8d93-efd98ee46239\") " pod="openshift-image-registry/image-registry-55d6cb6cf9-tqwbb" Apr 24 14:24:03.072746 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:03.071790 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/55813995-c655-4039-9151-23a5a7023a30-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lq2ln\" (UID: \"55813995-c655-4039-9151-23a5a7023a30\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lq2ln" Apr 24 14:24:03.072746 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:03.071878 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d7576e0a-caf3-4817-8250-2b6570598ac0-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-l59rf\" (UID: \"d7576e0a-caf3-4817-8250-2b6570598ac0\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-l59rf" Apr 24 14:24:03.072746 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:03.072058 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 14:24:03.072746 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:03.072127 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7576e0a-caf3-4817-8250-2b6570598ac0-cluster-monitoring-operator-tls podName:d7576e0a-caf3-4817-8250-2b6570598ac0 nodeName:}" 
failed. No retries permitted until 2026-04-24 14:24:04.072107629 +0000 UTC m=+34.662260334 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d7576e0a-caf3-4817-8250-2b6570598ac0-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-l59rf" (UID: "d7576e0a-caf3-4817-8250-2b6570598ac0") : secret "cluster-monitoring-operator-tls" not found Apr 24 14:24:03.072746 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:03.072204 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 14:24:03.072746 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:03.072216 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55d6cb6cf9-tqwbb: secret "image-registry-tls" not found Apr 24 14:24:03.072746 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:03.072252 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cbc7c460-1a4a-47be-8d93-efd98ee46239-registry-tls podName:cbc7c460-1a4a-47be-8d93-efd98ee46239 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:04.072239953 +0000 UTC m=+34.662392655 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cbc7c460-1a4a-47be-8d93-efd98ee46239-registry-tls") pod "image-registry-55d6cb6cf9-tqwbb" (UID: "cbc7c460-1a4a-47be-8d93-efd98ee46239") : secret "image-registry-tls" not found Apr 24 14:24:03.072746 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:03.072305 2567 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 14:24:03.072746 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:03.072340 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55813995-c655-4039-9151-23a5a7023a30-samples-operator-tls podName:55813995-c655-4039-9151-23a5a7023a30 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:04.072329915 +0000 UTC m=+34.662482617 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/55813995-c655-4039-9151-23a5a7023a30-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-lq2ln" (UID: "55813995-c655-4039-9151-23a5a7023a30") : secret "samples-operator-tls" not found Apr 24 14:24:03.131552 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:03.131501 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6dd6f99597-fnwrd"] Apr 24 14:24:03.138753 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:24:03.138399 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a2bcc78_148a_4024_85bf_f0bfdf9c6f93.slice/crio-bfd3c31c0e0eb12f67941d4a1c25b0dbb37402273ac2a840d6bbefadf139fba9 WatchSource:0}: Error finding container bfd3c31c0e0eb12f67941d4a1c25b0dbb37402273ac2a840d6bbefadf139fba9: Status 404 returned error can't find the container with id bfd3c31c0e0eb12f67941d4a1c25b0dbb37402273ac2a840d6bbefadf139fba9 Apr 24 14:24:03.140761 
ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:03.140538 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79d47f7669-4ff6w"] Apr 24 14:24:03.142767 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:24:03.142718 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84bc3c16_3cf7_443f_887d_67eaa7bb7631.slice/crio-da2f3c86cabd150eda6ba04915e26a25d476d25e397a52ded06e8d10dd036a80 WatchSource:0}: Error finding container da2f3c86cabd150eda6ba04915e26a25d476d25e397a52ded06e8d10dd036a80: Status 404 returned error can't find the container with id da2f3c86cabd150eda6ba04915e26a25d476d25e397a52ded06e8d10dd036a80 Apr 24 14:24:03.147900 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:03.147768 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-xw99m"] Apr 24 14:24:03.153060 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:24:03.153031 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f90d0c4_7a9d_46c4_bb3a_d2d928bb1412.slice/crio-23036dfd9bd51cd02258143ab17b7014e232c27db48f92405b5cc76d6e814dc5 WatchSource:0}: Error finding container 23036dfd9bd51cd02258143ab17b7014e232c27db48f92405b5cc76d6e814dc5: Status 404 returned error can't find the container with id 23036dfd9bd51cd02258143ab17b7014e232c27db48f92405b5cc76d6e814dc5 Apr 24 14:24:03.164314 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:03.164271 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cdcd8c5b4-fff9m"] Apr 24 14:24:03.168388 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:03.168359 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-7979n"] Apr 24 14:24:03.170759 
ip-10-0-128-36 kubenswrapper[2567]: W0424 14:24:03.170725 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52687235_a07e_4c1c_83a6_1f59714d998d.slice/crio-a7b4d5ad65c9010fe873d0164e3bb22267306c6b11ca3bceb530588c8d5c41cc WatchSource:0}: Error finding container a7b4d5ad65c9010fe873d0164e3bb22267306c6b11ca3bceb530588c8d5c41cc: Status 404 returned error can't find the container with id a7b4d5ad65c9010fe873d0164e3bb22267306c6b11ca3bceb530588c8d5c41cc Apr 24 14:24:03.171853 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:24:03.171829 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfee20684_697d_4937_9bf8_9549ab5442bf.slice/crio-0105c005e9f51ba7915bffc71b46cfe0c3b38db2d2627bb033eddd45859794bf WatchSource:0}: Error finding container 0105c005e9f51ba7915bffc71b46cfe0c3b38db2d2627bb033eddd45859794bf: Status 404 returned error can't find the container with id 0105c005e9f51ba7915bffc71b46cfe0c3b38db2d2627bb033eddd45859794bf Apr 24 14:24:03.172696 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:03.172449 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d63229a9-def3-4d90-95d7-748eed4531eb-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wsvfh\" (UID: \"d63229a9-def3-4d90-95d7-748eed4531eb\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wsvfh" Apr 24 14:24:03.172696 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:03.172496 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/743626cc-db6b-4ae5-a8dc-cceaa1cb8be0-metrics-tls\") pod \"dns-default-qcxs6\" (UID: \"743626cc-db6b-4ae5-a8dc-cceaa1cb8be0\") " pod="openshift-dns/dns-default-qcxs6" Apr 24 14:24:03.172783 ip-10-0-128-36 
kubenswrapper[2567]: E0424 14:24:03.172706 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 14:24:03.172783 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:03.172758 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/743626cc-db6b-4ae5-a8dc-cceaa1cb8be0-metrics-tls podName:743626cc-db6b-4ae5-a8dc-cceaa1cb8be0 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:04.172740282 +0000 UTC m=+34.762892982 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/743626cc-db6b-4ae5-a8dc-cceaa1cb8be0-metrics-tls") pod "dns-default-qcxs6" (UID: "743626cc-db6b-4ae5-a8dc-cceaa1cb8be0") : secret "dns-default-metrics-tls" not found Apr 24 14:24:03.173159 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:03.173139 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 14:24:03.173226 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:03.173209 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d63229a9-def3-4d90-95d7-748eed4531eb-networking-console-plugin-cert podName:d63229a9-def3-4d90-95d7-748eed4531eb nodeName:}" failed. No retries permitted until 2026-04-24 14:24:04.173192723 +0000 UTC m=+34.763345447 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/d63229a9-def3-4d90-95d7-748eed4531eb-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-wsvfh" (UID: "d63229a9-def3-4d90-95d7-748eed4531eb") : secret "networking-console-plugin-cert" not found Apr 24 14:24:03.174384 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:03.174270 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qtdqf"] Apr 24 14:24:03.179745 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:03.179726 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-28tw4"] Apr 24 14:24:03.182727 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:24:03.182706 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod447bb5cc_c01c_4312_b625_551aa4e765b3.slice/crio-b6d254062e8b7590fc5aa5d14f3bc9d728c06dae7073de6d67809ac50f706087 WatchSource:0}: Error finding container b6d254062e8b7590fc5aa5d14f3bc9d728c06dae7073de6d67809ac50f706087: Status 404 returned error can't find the container with id b6d254062e8b7590fc5aa5d14f3bc9d728c06dae7073de6d67809ac50f706087 Apr 24 14:24:03.183494 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:24:03.183467 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefd840e5_21ca_4b28_993c_7e8bd0b9a822.slice/crio-963c8a0dd61a884053bae7a873ebb684bf3b77469ab0ce1a0d0efebfa2bb31a5 WatchSource:0}: Error finding container 963c8a0dd61a884053bae7a873ebb684bf3b77469ab0ce1a0d0efebfa2bb31a5: Status 404 returned error can't find the container with id 963c8a0dd61a884053bae7a873ebb684bf3b77469ab0ce1a0d0efebfa2bb31a5 Apr 24 14:24:03.186212 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:03.186187 2567 kubelet.go:2544] 
"SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-rt2p5"] Apr 24 14:24:03.194840 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:24:03.194787 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b2b1cb4_7903_4e07_aa4d_25298b638889.slice/crio-095952c709dfc3f42a5fea8c2c33d4a969452c4dac70d8f48a76b74656d5d6fc WatchSource:0}: Error finding container 095952c709dfc3f42a5fea8c2c33d4a969452c4dac70d8f48a76b74656d5d6fc: Status 404 returned error can't find the container with id 095952c709dfc3f42a5fea8c2c33d4a969452c4dac70d8f48a76b74656d5d6fc Apr 24 14:24:03.197286 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:03.197254 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cdcd8c5b4-fff9m" event={"ID":"52687235-a07e-4c1c-83a6-1f59714d998d","Type":"ContainerStarted","Data":"a7b4d5ad65c9010fe873d0164e3bb22267306c6b11ca3bceb530588c8d5c41cc"} Apr 24 14:24:03.198234 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:03.198208 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6dd6f99597-fnwrd" event={"ID":"3a2bcc78-148a-4024-85bf-f0bfdf9c6f93","Type":"ContainerStarted","Data":"bfd3c31c0e0eb12f67941d4a1c25b0dbb37402273ac2a840d6bbefadf139fba9"} Apr 24 14:24:03.199240 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:03.199222 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79d47f7669-4ff6w" event={"ID":"84bc3c16-3cf7-443f-887d-67eaa7bb7631","Type":"ContainerStarted","Data":"da2f3c86cabd150eda6ba04915e26a25d476d25e397a52ded06e8d10dd036a80"} Apr 24 14:24:03.200230 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:03.200210 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qtdqf" 
event={"ID":"447bb5cc-c01c-4312-b625-551aa4e765b3","Type":"ContainerStarted","Data":"b6d254062e8b7590fc5aa5d14f3bc9d728c06dae7073de6d67809ac50f706087"} Apr 24 14:24:03.201030 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:03.201012 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-xw99m" event={"ID":"0f90d0c4-7a9d-46c4-bb3a-d2d928bb1412","Type":"ContainerStarted","Data":"23036dfd9bd51cd02258143ab17b7014e232c27db48f92405b5cc76d6e814dc5"} Apr 24 14:24:03.201970 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:03.201949 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-7979n" event={"ID":"fee20684-697d-4937-9bf8-9549ab5442bf","Type":"ContainerStarted","Data":"0105c005e9f51ba7915bffc71b46cfe0c3b38db2d2627bb033eddd45859794bf"} Apr 24 14:24:03.202849 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:03.202832 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-28tw4" event={"ID":"efd840e5-21ca-4b28-993c-7e8bd0b9a822","Type":"ContainerStarted","Data":"963c8a0dd61a884053bae7a873ebb684bf3b77469ab0ce1a0d0efebfa2bb31a5"} Apr 24 14:24:03.273982 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:03.273883 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d834bc9a-bc43-42cc-82ed-5b3a77d4da5d-cert\") pod \"ingress-canary-95gsb\" (UID: \"d834bc9a-bc43-42cc-82ed-5b3a77d4da5d\") " pod="openshift-ingress-canary/ingress-canary-95gsb" Apr 24 14:24:03.273982 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:03.273980 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 14:24:03.274483 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:03.274050 2567 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/d834bc9a-bc43-42cc-82ed-5b3a77d4da5d-cert podName:d834bc9a-bc43-42cc-82ed-5b3a77d4da5d nodeName:}" failed. No retries permitted until 2026-04-24 14:24:04.274027529 +0000 UTC m=+34.864180234 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d834bc9a-bc43-42cc-82ed-5b3a77d4da5d-cert") pod "ingress-canary-95gsb" (UID: "d834bc9a-bc43-42cc-82ed-5b3a77d4da5d") : secret "canary-serving-cert" not found Apr 24 14:24:03.676961 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:03.676924 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/952d5757-28bc-4940-9fa6-4a50ffff6476-metrics-certs\") pod \"network-metrics-daemon-ct9nz\" (UID: \"952d5757-28bc-4940-9fa6-4a50ffff6476\") " pod="openshift-multus/network-metrics-daemon-ct9nz" Apr 24 14:24:03.677168 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:03.677075 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:03.677168 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:03.677159 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/952d5757-28bc-4940-9fa6-4a50ffff6476-metrics-certs podName:952d5757-28bc-4940-9fa6-4a50ffff6476 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:35.677127384 +0000 UTC m=+66.267280084 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/952d5757-28bc-4940-9fa6-4a50ffff6476-metrics-certs") pod "network-metrics-daemon-ct9nz" (UID: "952d5757-28bc-4940-9fa6-4a50ffff6476") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:03.778047 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:03.778005 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tzql6\" (UniqueName: \"kubernetes.io/projected/f33ebad9-63f4-4a25-865f-68c02ee70c85-kube-api-access-tzql6\") pod \"network-check-target-57rkt\" (UID: \"f33ebad9-63f4-4a25-865f-68c02ee70c85\") " pod="openshift-network-diagnostics/network-check-target-57rkt" Apr 24 14:24:03.789711 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:03.789682 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzql6\" (UniqueName: \"kubernetes.io/projected/f33ebad9-63f4-4a25-865f-68c02ee70c85-kube-api-access-tzql6\") pod \"network-check-target-57rkt\" (UID: \"f33ebad9-63f4-4a25-865f-68c02ee70c85\") " pod="openshift-network-diagnostics/network-check-target-57rkt" Apr 24 14:24:04.021370 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:04.019778 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ct9nz" Apr 24 14:24:04.021370 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:04.020681 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6gs2p" Apr 24 14:24:04.021370 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:04.021101 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57rkt" Apr 24 14:24:04.023982 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:04.022943 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 14:24:04.023982 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:04.023370 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-p2dnd\"" Apr 24 14:24:04.023982 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:04.023655 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 14:24:04.023982 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:04.023901 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-qmh6n\"" Apr 24 14:24:04.057678 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:04.057639 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57rkt" Apr 24 14:24:04.086872 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:04.085800 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cbc7c460-1a4a-47be-8d93-efd98ee46239-registry-tls\") pod \"image-registry-55d6cb6cf9-tqwbb\" (UID: \"cbc7c460-1a4a-47be-8d93-efd98ee46239\") " pod="openshift-image-registry/image-registry-55d6cb6cf9-tqwbb" Apr 24 14:24:04.086872 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:04.085895 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/55813995-c655-4039-9151-23a5a7023a30-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lq2ln\" (UID: \"55813995-c655-4039-9151-23a5a7023a30\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lq2ln" Apr 24 14:24:04.086872 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:04.085945 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d7576e0a-caf3-4817-8250-2b6570598ac0-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-l59rf\" (UID: \"d7576e0a-caf3-4817-8250-2b6570598ac0\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-l59rf" Apr 24 14:24:04.086872 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:04.086157 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 14:24:04.086872 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:04.086224 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7576e0a-caf3-4817-8250-2b6570598ac0-cluster-monitoring-operator-tls podName:d7576e0a-caf3-4817-8250-2b6570598ac0 nodeName:}" failed. 
No retries permitted until 2026-04-24 14:24:06.08620576 +0000 UTC m=+36.676358467 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d7576e0a-caf3-4817-8250-2b6570598ac0-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-l59rf" (UID: "d7576e0a-caf3-4817-8250-2b6570598ac0") : secret "cluster-monitoring-operator-tls" not found Apr 24 14:24:04.086872 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:04.086663 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 14:24:04.086872 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:04.086677 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55d6cb6cf9-tqwbb: secret "image-registry-tls" not found Apr 24 14:24:04.086872 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:04.086720 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cbc7c460-1a4a-47be-8d93-efd98ee46239-registry-tls podName:cbc7c460-1a4a-47be-8d93-efd98ee46239 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:06.086705725 +0000 UTC m=+36.676858427 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cbc7c460-1a4a-47be-8d93-efd98ee46239-registry-tls") pod "image-registry-55d6cb6cf9-tqwbb" (UID: "cbc7c460-1a4a-47be-8d93-efd98ee46239") : secret "image-registry-tls" not found Apr 24 14:24:04.086872 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:04.086783 2567 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 14:24:04.086872 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:04.086837 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55813995-c655-4039-9151-23a5a7023a30-samples-operator-tls podName:55813995-c655-4039-9151-23a5a7023a30 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:06.08682525 +0000 UTC m=+36.676977951 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/55813995-c655-4039-9151-23a5a7023a30-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-lq2ln" (UID: "55813995-c655-4039-9151-23a5a7023a30") : secret "samples-operator-tls" not found Apr 24 14:24:04.188694 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:04.187450 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d63229a9-def3-4d90-95d7-748eed4531eb-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wsvfh\" (UID: \"d63229a9-def3-4d90-95d7-748eed4531eb\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wsvfh" Apr 24 14:24:04.188694 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:04.187524 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/743626cc-db6b-4ae5-a8dc-cceaa1cb8be0-metrics-tls\") pod \"dns-default-qcxs6\" (UID: 
\"743626cc-db6b-4ae5-a8dc-cceaa1cb8be0\") " pod="openshift-dns/dns-default-qcxs6" Apr 24 14:24:04.188694 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:04.187657 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 14:24:04.188694 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:04.187705 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/743626cc-db6b-4ae5-a8dc-cceaa1cb8be0-metrics-tls podName:743626cc-db6b-4ae5-a8dc-cceaa1cb8be0 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:06.187691173 +0000 UTC m=+36.777843874 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/743626cc-db6b-4ae5-a8dc-cceaa1cb8be0-metrics-tls") pod "dns-default-qcxs6" (UID: "743626cc-db6b-4ae5-a8dc-cceaa1cb8be0") : secret "dns-default-metrics-tls" not found Apr 24 14:24:04.188694 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:04.188343 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 14:24:04.188694 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:04.188392 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d63229a9-def3-4d90-95d7-748eed4531eb-networking-console-plugin-cert podName:d63229a9-def3-4d90-95d7-748eed4531eb nodeName:}" failed. No retries permitted until 2026-04-24 14:24:06.1883765 +0000 UTC m=+36.778529205 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/d63229a9-def3-4d90-95d7-748eed4531eb-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-wsvfh" (UID: "d63229a9-def3-4d90-95d7-748eed4531eb") : secret "networking-console-plugin-cert" not found Apr 24 14:24:04.211524 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:04.211462 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-rt2p5" event={"ID":"5b2b1cb4-7903-4e07-aa4d-25298b638889","Type":"ContainerStarted","Data":"095952c709dfc3f42a5fea8c2c33d4a969452c4dac70d8f48a76b74656d5d6fc"} Apr 24 14:24:04.218709 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:04.217765 2567 generic.go:358] "Generic (PLEG): container finished" podID="04393dd2-c684-4592-bc88-2223fac95a11" containerID="e582d5d81943e39c58b04d07b374cbc5d3c8db34b5ab725a2f0907e90c80458a" exitCode=0 Apr 24 14:24:04.218709 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:04.217829 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cb58k" event={"ID":"04393dd2-c684-4592-bc88-2223fac95a11","Type":"ContainerDied","Data":"e582d5d81943e39c58b04d07b374cbc5d3c8db34b5ab725a2f0907e90c80458a"} Apr 24 14:24:04.226446 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:04.224902 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-57rkt"] Apr 24 14:24:04.233093 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:24:04.233052 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf33ebad9_63f4_4a25_865f_68c02ee70c85.slice/crio-649a3ef158e272c48662a79a0527019eb9f81d2ff56ba194ebca976cc0ef99bc WatchSource:0}: Error finding container 649a3ef158e272c48662a79a0527019eb9f81d2ff56ba194ebca976cc0ef99bc: Status 404 returned error can't find the container with id 
649a3ef158e272c48662a79a0527019eb9f81d2ff56ba194ebca976cc0ef99bc Apr 24 14:24:04.288484 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:04.288405 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d834bc9a-bc43-42cc-82ed-5b3a77d4da5d-cert\") pod \"ingress-canary-95gsb\" (UID: \"d834bc9a-bc43-42cc-82ed-5b3a77d4da5d\") " pod="openshift-ingress-canary/ingress-canary-95gsb" Apr 24 14:24:04.289152 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:04.288467 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9c15bbb5-ef9b-4df0-b792-073931b97ea8-original-pull-secret\") pod \"global-pull-secret-syncer-6gs2p\" (UID: \"9c15bbb5-ef9b-4df0-b792-073931b97ea8\") " pod="kube-system/global-pull-secret-syncer-6gs2p" Apr 24 14:24:04.289292 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:04.289179 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 14:24:04.289292 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:04.289251 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d834bc9a-bc43-42cc-82ed-5b3a77d4da5d-cert podName:d834bc9a-bc43-42cc-82ed-5b3a77d4da5d nodeName:}" failed. No retries permitted until 2026-04-24 14:24:06.289230553 +0000 UTC m=+36.879383255 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d834bc9a-bc43-42cc-82ed-5b3a77d4da5d-cert") pod "ingress-canary-95gsb" (UID: "d834bc9a-bc43-42cc-82ed-5b3a77d4da5d") : secret "canary-serving-cert" not found Apr 24 14:24:04.301345 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:04.301317 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9c15bbb5-ef9b-4df0-b792-073931b97ea8-original-pull-secret\") pod \"global-pull-secret-syncer-6gs2p\" (UID: \"9c15bbb5-ef9b-4df0-b792-073931b97ea8\") " pod="kube-system/global-pull-secret-syncer-6gs2p" Apr 24 14:24:04.349894 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:04.349498 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6gs2p" Apr 24 14:24:04.533842 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:04.533709 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-6gs2p"] Apr 24 14:24:04.540870 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:24:04.540488 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c15bbb5_ef9b_4df0_b792_073931b97ea8.slice/crio-2741fd63e7dfbea47fc16cf00396ff658bed056f96791f604914fb297e2e6083 WatchSource:0}: Error finding container 2741fd63e7dfbea47fc16cf00396ff658bed056f96791f604914fb297e2e6083: Status 404 returned error can't find the container with id 2741fd63e7dfbea47fc16cf00396ff658bed056f96791f604914fb297e2e6083 Apr 24 14:24:05.235470 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:05.235401 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-57rkt" event={"ID":"f33ebad9-63f4-4a25-865f-68c02ee70c85","Type":"ContainerStarted","Data":"649a3ef158e272c48662a79a0527019eb9f81d2ff56ba194ebca976cc0ef99bc"} Apr 24 14:24:05.260651 ip-10-0-128-36 
kubenswrapper[2567]: I0424 14:24:05.260617 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-6gs2p" event={"ID":"9c15bbb5-ef9b-4df0-b792-073931b97ea8","Type":"ContainerStarted","Data":"2741fd63e7dfbea47fc16cf00396ff658bed056f96791f604914fb297e2e6083"} Apr 24 14:24:05.298107 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:05.297694 2567 generic.go:358] "Generic (PLEG): container finished" podID="04393dd2-c684-4592-bc88-2223fac95a11" containerID="16ad6fe62e5d51d7a72f37ae75982d8272bf9ae6b0154c0faa0f8fed78531310" exitCode=0 Apr 24 14:24:05.298107 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:05.297762 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cb58k" event={"ID":"04393dd2-c684-4592-bc88-2223fac95a11","Type":"ContainerDied","Data":"16ad6fe62e5d51d7a72f37ae75982d8272bf9ae6b0154c0faa0f8fed78531310"} Apr 24 14:24:06.108042 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:06.108004 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d7576e0a-caf3-4817-8250-2b6570598ac0-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-l59rf\" (UID: \"d7576e0a-caf3-4817-8250-2b6570598ac0\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-l59rf" Apr 24 14:24:06.108217 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:06.108154 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cbc7c460-1a4a-47be-8d93-efd98ee46239-registry-tls\") pod \"image-registry-55d6cb6cf9-tqwbb\" (UID: \"cbc7c460-1a4a-47be-8d93-efd98ee46239\") " pod="openshift-image-registry/image-registry-55d6cb6cf9-tqwbb" Apr 24 14:24:06.108287 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:06.108221 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/55813995-c655-4039-9151-23a5a7023a30-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lq2ln\" (UID: \"55813995-c655-4039-9151-23a5a7023a30\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lq2ln" Apr 24 14:24:06.108375 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:06.108356 2567 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 14:24:06.108464 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:06.108445 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55813995-c655-4039-9151-23a5a7023a30-samples-operator-tls podName:55813995-c655-4039-9151-23a5a7023a30 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:10.108425598 +0000 UTC m=+40.698578299 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/55813995-c655-4039-9151-23a5a7023a30-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-lq2ln" (UID: "55813995-c655-4039-9151-23a5a7023a30") : secret "samples-operator-tls" not found Apr 24 14:24:06.109028 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:06.109007 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 14:24:06.109131 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:06.109066 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7576e0a-caf3-4817-8250-2b6570598ac0-cluster-monitoring-operator-tls podName:d7576e0a-caf3-4817-8250-2b6570598ac0 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:10.10905047 +0000 UTC m=+40.699203172 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d7576e0a-caf3-4817-8250-2b6570598ac0-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-l59rf" (UID: "d7576e0a-caf3-4817-8250-2b6570598ac0") : secret "cluster-monitoring-operator-tls" not found Apr 24 14:24:06.109194 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:06.109135 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 14:24:06.109194 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:06.109145 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55d6cb6cf9-tqwbb: secret "image-registry-tls" not found Apr 24 14:24:06.109289 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:06.109201 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cbc7c460-1a4a-47be-8d93-efd98ee46239-registry-tls podName:cbc7c460-1a4a-47be-8d93-efd98ee46239 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:10.109189779 +0000 UTC m=+40.699342493 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cbc7c460-1a4a-47be-8d93-efd98ee46239-registry-tls") pod "image-registry-55d6cb6cf9-tqwbb" (UID: "cbc7c460-1a4a-47be-8d93-efd98ee46239") : secret "image-registry-tls" not found Apr 24 14:24:06.209561 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:06.209485 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d63229a9-def3-4d90-95d7-748eed4531eb-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wsvfh\" (UID: \"d63229a9-def3-4d90-95d7-748eed4531eb\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wsvfh" Apr 24 14:24:06.209729 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:06.209586 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/743626cc-db6b-4ae5-a8dc-cceaa1cb8be0-metrics-tls\") pod \"dns-default-qcxs6\" (UID: \"743626cc-db6b-4ae5-a8dc-cceaa1cb8be0\") " pod="openshift-dns/dns-default-qcxs6" Apr 24 14:24:06.209840 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:06.209818 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 14:24:06.209940 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:06.209922 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/743626cc-db6b-4ae5-a8dc-cceaa1cb8be0-metrics-tls podName:743626cc-db6b-4ae5-a8dc-cceaa1cb8be0 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:10.209901053 +0000 UTC m=+40.800053768 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/743626cc-db6b-4ae5-a8dc-cceaa1cb8be0-metrics-tls") pod "dns-default-qcxs6" (UID: "743626cc-db6b-4ae5-a8dc-cceaa1cb8be0") : secret "dns-default-metrics-tls" not found Apr 24 14:24:06.210105 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:06.210089 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 14:24:06.210179 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:06.210147 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d63229a9-def3-4d90-95d7-748eed4531eb-networking-console-plugin-cert podName:d63229a9-def3-4d90-95d7-748eed4531eb nodeName:}" failed. No retries permitted until 2026-04-24 14:24:10.210132531 +0000 UTC m=+40.800285231 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/d63229a9-def3-4d90-95d7-748eed4531eb-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-wsvfh" (UID: "d63229a9-def3-4d90-95d7-748eed4531eb") : secret "networking-console-plugin-cert" not found Apr 24 14:24:06.310927 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:06.310888 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d834bc9a-bc43-42cc-82ed-5b3a77d4da5d-cert\") pod \"ingress-canary-95gsb\" (UID: \"d834bc9a-bc43-42cc-82ed-5b3a77d4da5d\") " pod="openshift-ingress-canary/ingress-canary-95gsb" Apr 24 14:24:06.311361 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:06.311081 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 14:24:06.311361 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:06.311137 2567 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/d834bc9a-bc43-42cc-82ed-5b3a77d4da5d-cert podName:d834bc9a-bc43-42cc-82ed-5b3a77d4da5d nodeName:}" failed. No retries permitted until 2026-04-24 14:24:10.311119177 +0000 UTC m=+40.901271910 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d834bc9a-bc43-42cc-82ed-5b3a77d4da5d-cert") pod "ingress-canary-95gsb" (UID: "d834bc9a-bc43-42cc-82ed-5b3a77d4da5d") : secret "canary-serving-cert" not found Apr 24 14:24:10.148895 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:10.148862 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/55813995-c655-4039-9151-23a5a7023a30-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lq2ln\" (UID: \"55813995-c655-4039-9151-23a5a7023a30\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lq2ln" Apr 24 14:24:10.149372 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:10.148922 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d7576e0a-caf3-4817-8250-2b6570598ac0-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-l59rf\" (UID: \"d7576e0a-caf3-4817-8250-2b6570598ac0\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-l59rf" Apr 24 14:24:10.149372 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:10.149026 2567 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 14:24:10.149372 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:10.149069 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 14:24:10.149372 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:10.149034 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cbc7c460-1a4a-47be-8d93-efd98ee46239-registry-tls\") pod \"image-registry-55d6cb6cf9-tqwbb\" (UID: \"cbc7c460-1a4a-47be-8d93-efd98ee46239\") " pod="openshift-image-registry/image-registry-55d6cb6cf9-tqwbb" Apr 24 14:24:10.149372 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:10.149100 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55813995-c655-4039-9151-23a5a7023a30-samples-operator-tls podName:55813995-c655-4039-9151-23a5a7023a30 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:18.149079883 +0000 UTC m=+48.739232585 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/55813995-c655-4039-9151-23a5a7023a30-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-lq2ln" (UID: "55813995-c655-4039-9151-23a5a7023a30") : secret "samples-operator-tls" not found Apr 24 14:24:10.149372 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:10.149108 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 14:24:10.149372 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:10.149123 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55d6cb6cf9-tqwbb: secret "image-registry-tls" not found Apr 24 14:24:10.149372 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:10.149141 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7576e0a-caf3-4817-8250-2b6570598ac0-cluster-monitoring-operator-tls podName:d7576e0a-caf3-4817-8250-2b6570598ac0 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:18.149119908 +0000 UTC m=+48.739272616 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d7576e0a-caf3-4817-8250-2b6570598ac0-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-l59rf" (UID: "d7576e0a-caf3-4817-8250-2b6570598ac0") : secret "cluster-monitoring-operator-tls" not found
Apr 24 14:24:10.149372 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:10.149184 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cbc7c460-1a4a-47be-8d93-efd98ee46239-registry-tls podName:cbc7c460-1a4a-47be-8d93-efd98ee46239 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:18.149174569 +0000 UTC m=+48.739327279 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cbc7c460-1a4a-47be-8d93-efd98ee46239-registry-tls") pod "image-registry-55d6cb6cf9-tqwbb" (UID: "cbc7c460-1a4a-47be-8d93-efd98ee46239") : secret "image-registry-tls" not found
Apr 24 14:24:10.250080 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:10.249984 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d63229a9-def3-4d90-95d7-748eed4531eb-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wsvfh\" (UID: \"d63229a9-def3-4d90-95d7-748eed4531eb\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wsvfh"
Apr 24 14:24:10.250080 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:10.250030 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/743626cc-db6b-4ae5-a8dc-cceaa1cb8be0-metrics-tls\") pod \"dns-default-qcxs6\" (UID: \"743626cc-db6b-4ae5-a8dc-cceaa1cb8be0\") " pod="openshift-dns/dns-default-qcxs6"
Apr 24 14:24:10.250295 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:10.250144 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 24 14:24:10.250295 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:10.250225 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d63229a9-def3-4d90-95d7-748eed4531eb-networking-console-plugin-cert podName:d63229a9-def3-4d90-95d7-748eed4531eb nodeName:}" failed. No retries permitted until 2026-04-24 14:24:18.250204177 +0000 UTC m=+48.840356883 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/d63229a9-def3-4d90-95d7-748eed4531eb-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-wsvfh" (UID: "d63229a9-def3-4d90-95d7-748eed4531eb") : secret "networking-console-plugin-cert" not found
Apr 24 14:24:10.250295 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:10.250244 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 14:24:10.250440 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:10.250320 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/743626cc-db6b-4ae5-a8dc-cceaa1cb8be0-metrics-tls podName:743626cc-db6b-4ae5-a8dc-cceaa1cb8be0 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:18.250301365 +0000 UTC m=+48.840454065 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/743626cc-db6b-4ae5-a8dc-cceaa1cb8be0-metrics-tls") pod "dns-default-qcxs6" (UID: "743626cc-db6b-4ae5-a8dc-cceaa1cb8be0") : secret "dns-default-metrics-tls" not found
Apr 24 14:24:10.350872 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:10.350840 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d834bc9a-bc43-42cc-82ed-5b3a77d4da5d-cert\") pod \"ingress-canary-95gsb\" (UID: \"d834bc9a-bc43-42cc-82ed-5b3a77d4da5d\") " pod="openshift-ingress-canary/ingress-canary-95gsb"
Apr 24 14:24:10.351080 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:10.350976 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 14:24:10.351080 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:10.351041 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d834bc9a-bc43-42cc-82ed-5b3a77d4da5d-cert podName:d834bc9a-bc43-42cc-82ed-5b3a77d4da5d nodeName:}" failed. No retries permitted until 2026-04-24 14:24:18.351026438 +0000 UTC m=+48.941179138 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d834bc9a-bc43-42cc-82ed-5b3a77d4da5d-cert") pod "ingress-canary-95gsb" (UID: "d834bc9a-bc43-42cc-82ed-5b3a77d4da5d") : secret "canary-serving-cert" not found
Apr 24 14:24:17.339838 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:17.336274 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6dd6f99597-fnwrd" event={"ID":"3a2bcc78-148a-4024-85bf-f0bfdf9c6f93","Type":"ContainerStarted","Data":"02cb1cdba036b9a62e77920b72006796e2aa462dcfa880e9d4cabc1c056d9b66"}
Apr 24 14:24:17.341871 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:17.341837 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-rt2p5" event={"ID":"5b2b1cb4-7903-4e07-aa4d-25298b638889","Type":"ContainerStarted","Data":"8f277da6483ff3a53688389f0921d952498dc283cf9d7695b14e74292c72796c"}
Apr 24 14:24:17.343355 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:17.343319 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79d47f7669-4ff6w" event={"ID":"84bc3c16-3cf7-443f-887d-67eaa7bb7631","Type":"ContainerStarted","Data":"5f18bdbb4a75084bf58a9c3f44b753427828ee97a9469b02ee0c8a607855ea1d"}
Apr 24 14:24:17.344624 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:17.344605 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qtdqf" event={"ID":"447bb5cc-c01c-4312-b625-551aa4e765b3","Type":"ContainerStarted","Data":"2f2e98d893b95f0d1935361cc07236304c5ecfee2141eb550fd4b5ecf9fa861a"}
Apr 24 14:24:17.346086 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:17.346068 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-57rkt" event={"ID":"f33ebad9-63f4-4a25-865f-68c02ee70c85","Type":"ContainerStarted","Data":"15c01ae69de242319dff14ea919b0c94de02df790a39b98b8041c1fcb1cf2f41"}
Apr 24 14:24:17.346218 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:17.346196 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-57rkt"
Apr 24 14:24:17.347534 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:17.347507 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-xw99m" event={"ID":"0f90d0c4-7a9d-46c4-bb3a-d2d928bb1412","Type":"ContainerStarted","Data":"6b129fc9f948a1bf15aa0b0724e25afeb8e76279e39205cacd6c7997a682fe3b"}
Apr 24 14:24:17.349366 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:17.349350 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7979n_fee20684-697d-4937-9bf8-9549ab5442bf/console-operator/0.log"
Apr 24 14:24:17.349603 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:17.349573 2567 generic.go:358] "Generic (PLEG): container finished" podID="fee20684-697d-4937-9bf8-9549ab5442bf" containerID="73a47733a07074ebb7eb5dad7c4ff42172d2b892d9b434d01b44bd6ce301c1bd" exitCode=255
Apr 24 14:24:17.350038 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:17.349949 2567 scope.go:117] "RemoveContainer" containerID="73a47733a07074ebb7eb5dad7c4ff42172d2b892d9b434d01b44bd6ce301c1bd"
Apr 24 14:24:17.351890 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:17.349963 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-7979n" event={"ID":"fee20684-697d-4937-9bf8-9549ab5442bf","Type":"ContainerDied","Data":"73a47733a07074ebb7eb5dad7c4ff42172d2b892d9b434d01b44bd6ce301c1bd"}
Apr 24 14:24:17.354862 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:17.354798 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6dd6f99597-fnwrd" podStartSLOduration=29.798304793 podStartE2EDuration="43.354784436s" podCreationTimestamp="2026-04-24 14:23:34 +0000 UTC" firstStartedPulling="2026-04-24 14:24:03.151113644 +0000 UTC m=+33.741266349" lastFinishedPulling="2026-04-24 14:24:16.707593285 +0000 UTC m=+47.297745992" observedRunningTime="2026-04-24 14:24:17.353177983 +0000 UTC m=+47.943330706" watchObservedRunningTime="2026-04-24 14:24:17.354784436 +0000 UTC m=+47.944937160"
Apr 24 14:24:17.355091 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:17.354876 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-28tw4" event={"ID":"efd840e5-21ca-4b28-993c-7e8bd0b9a822","Type":"ContainerStarted","Data":"5d6707f2864c35caacd186670de659955c30698fdf180c3acda28223c01da4cd"}
Apr 24 14:24:17.356501 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:17.356479 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-6gs2p" event={"ID":"9c15bbb5-ef9b-4df0-b792-073931b97ea8","Type":"ContainerStarted","Data":"bcef90d00792bfb980239a0ae0514a01ce5f2bac885ce2b5c35fe3716a3d4ad4"}
Apr 24 14:24:17.357954 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:17.357928 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cdcd8c5b4-fff9m" event={"ID":"52687235-a07e-4c1c-83a6-1f59714d998d","Type":"ContainerStarted","Data":"2f545913b193571faa5f340de3d618cc0b06d57fb943660463b617924909b0b1"}
Apr 24 14:24:17.358242 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:17.358224 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cdcd8c5b4-fff9m"
Apr 24 14:24:17.360030 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:17.360010 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cdcd8c5b4-fff9m"
Apr 24 14:24:17.361020 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:17.361001 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cb58k" event={"ID":"04393dd2-c684-4592-bc88-2223fac95a11","Type":"ContainerStarted","Data":"b38c9671b808309252f6a6cba1a5753d93a4453a588a7c9679ee5815285a4a3f"}
Apr 24 14:24:17.371126 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:17.371087 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qtdqf" podStartSLOduration=24.978569543 podStartE2EDuration="38.371075085s" podCreationTimestamp="2026-04-24 14:23:39 +0000 UTC" firstStartedPulling="2026-04-24 14:24:03.18482418 +0000 UTC m=+33.774976885" lastFinishedPulling="2026-04-24 14:24:16.577329723 +0000 UTC m=+47.167482427" observedRunningTime="2026-04-24 14:24:17.36970634 +0000 UTC m=+47.959859063" watchObservedRunningTime="2026-04-24 14:24:17.371075085 +0000 UTC m=+47.961227805"
Apr 24 14:24:17.389485 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:17.389435 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-xw99m" podStartSLOduration=36.329641866 podStartE2EDuration="46.389417488s" podCreationTimestamp="2026-04-24 14:23:31 +0000 UTC" firstStartedPulling="2026-04-24 14:24:03.155492585 +0000 UTC m=+33.745645288" lastFinishedPulling="2026-04-24 14:24:13.215268195 +0000 UTC m=+43.805420910" observedRunningTime="2026-04-24 14:24:17.388435366 +0000 UTC m=+47.978588089" watchObservedRunningTime="2026-04-24 14:24:17.389417488 +0000 UTC m=+47.979570212"
Apr 24 14:24:17.419966 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:17.419413 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-rt2p5" podStartSLOduration=24.908522241 podStartE2EDuration="38.419398361s" podCreationTimestamp="2026-04-24 14:23:39 +0000 UTC" firstStartedPulling="2026-04-24 14:24:03.196994485 +0000 UTC m=+33.787147202" lastFinishedPulling="2026-04-24 14:24:16.707870607 +0000 UTC m=+47.298023322" observedRunningTime="2026-04-24 14:24:17.417832518 +0000 UTC m=+48.007985238" watchObservedRunningTime="2026-04-24 14:24:17.419398361 +0000 UTC m=+48.009551082"
Apr 24 14:24:17.433716 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:17.433674 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-57rkt" podStartSLOduration=34.951829422 podStartE2EDuration="47.433656162s" podCreationTimestamp="2026-04-24 14:23:30 +0000 UTC" firstStartedPulling="2026-04-24 14:24:04.236602888 +0000 UTC m=+34.826755595" lastFinishedPulling="2026-04-24 14:24:16.718429633 +0000 UTC m=+47.308582335" observedRunningTime="2026-04-24 14:24:17.432800477 +0000 UTC m=+48.022953202" watchObservedRunningTime="2026-04-24 14:24:17.433656162 +0000 UTC m=+48.023808875"
Apr 24 14:24:17.453005 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:17.452968 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-cb58k" podStartSLOduration=16.929199407 podStartE2EDuration="47.452955646s" podCreationTimestamp="2026-04-24 14:23:30 +0000 UTC" firstStartedPulling="2026-04-24 14:23:32.660176264 +0000 UTC m=+3.250328981" lastFinishedPulling="2026-04-24 14:24:03.183932507 +0000 UTC m=+33.774085220" observedRunningTime="2026-04-24 14:24:17.452165992 +0000 UTC m=+48.042318707" watchObservedRunningTime="2026-04-24 14:24:17.452955646 +0000 UTC m=+48.043108368"
Apr 24 14:24:17.465834 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:17.465768 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-6gs2p" podStartSLOduration=5.299323404 podStartE2EDuration="17.465750948s" podCreationTimestamp="2026-04-24 14:24:00 +0000 UTC" firstStartedPulling="2026-04-24 14:24:04.542637846 +0000 UTC m=+35.132790550" lastFinishedPulling="2026-04-24 14:24:16.70906539 +0000 UTC m=+47.299218094" observedRunningTime="2026-04-24 14:24:17.465158069 +0000 UTC m=+48.055310792" watchObservedRunningTime="2026-04-24 14:24:17.465750948 +0000 UTC m=+48.055903662"
Apr 24 14:24:17.481065 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:17.481012 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cdcd8c5b4-fff9m" podStartSLOduration=29.98415668 podStartE2EDuration="43.480996151s" podCreationTimestamp="2026-04-24 14:23:34 +0000 UTC" firstStartedPulling="2026-04-24 14:24:03.173191414 +0000 UTC m=+33.763344114" lastFinishedPulling="2026-04-24 14:24:16.670030882 +0000 UTC m=+47.260183585" observedRunningTime="2026-04-24 14:24:17.479967002 +0000 UTC m=+48.070119735" watchObservedRunningTime="2026-04-24 14:24:17.480996151 +0000 UTC m=+48.071148875"
Apr 24 14:24:17.494785 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:17.494719 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-28tw4" podStartSLOduration=28.010510043 podStartE2EDuration="41.494700733s" podCreationTimestamp="2026-04-24 14:23:36 +0000 UTC" firstStartedPulling="2026-04-24 14:24:03.185838768 +0000 UTC m=+33.775991468" lastFinishedPulling="2026-04-24 14:24:16.670029454 +0000 UTC m=+47.260182158" observedRunningTime="2026-04-24 14:24:17.493569904 +0000 UTC m=+48.083722626" watchObservedRunningTime="2026-04-24 14:24:17.494700733 +0000 UTC m=+48.084853455"
Apr 24 14:24:18.217586 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:18.217471 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cbc7c460-1a4a-47be-8d93-efd98ee46239-registry-tls\") pod \"image-registry-55d6cb6cf9-tqwbb\" (UID: \"cbc7c460-1a4a-47be-8d93-efd98ee46239\") " pod="openshift-image-registry/image-registry-55d6cb6cf9-tqwbb"
Apr 24 14:24:18.217586 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:18.217572 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/55813995-c655-4039-9151-23a5a7023a30-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lq2ln\" (UID: \"55813995-c655-4039-9151-23a5a7023a30\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lq2ln"
Apr 24 14:24:18.217852 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:18.217608 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 14:24:18.217852 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:18.217634 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55d6cb6cf9-tqwbb: secret "image-registry-tls" not found
Apr 24 14:24:18.217852 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:18.217634 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d7576e0a-caf3-4817-8250-2b6570598ac0-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-l59rf\" (UID: \"d7576e0a-caf3-4817-8250-2b6570598ac0\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-l59rf"
Apr 24 14:24:18.217852 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:18.217708 2567 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 14:24:18.217852 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:18.217711 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cbc7c460-1a4a-47be-8d93-efd98ee46239-registry-tls podName:cbc7c460-1a4a-47be-8d93-efd98ee46239 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:34.217692403 +0000 UTC m=+64.807845103 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cbc7c460-1a4a-47be-8d93-efd98ee46239-registry-tls") pod "image-registry-55d6cb6cf9-tqwbb" (UID: "cbc7c460-1a4a-47be-8d93-efd98ee46239") : secret "image-registry-tls" not found
Apr 24 14:24:18.217852 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:18.217770 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 24 14:24:18.217852 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:18.217818 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55813995-c655-4039-9151-23a5a7023a30-samples-operator-tls podName:55813995-c655-4039-9151-23a5a7023a30 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:34.21778691 +0000 UTC m=+64.807939610 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/55813995-c655-4039-9151-23a5a7023a30-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-lq2ln" (UID: "55813995-c655-4039-9151-23a5a7023a30") : secret "samples-operator-tls" not found
Apr 24 14:24:18.217852 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:18.217843 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7576e0a-caf3-4817-8250-2b6570598ac0-cluster-monitoring-operator-tls podName:d7576e0a-caf3-4817-8250-2b6570598ac0 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:34.217825655 +0000 UTC m=+64.807978359 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d7576e0a-caf3-4817-8250-2b6570598ac0-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-l59rf" (UID: "d7576e0a-caf3-4817-8250-2b6570598ac0") : secret "cluster-monitoring-operator-tls" not found
Apr 24 14:24:18.319161 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:18.319121 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d63229a9-def3-4d90-95d7-748eed4531eb-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wsvfh\" (UID: \"d63229a9-def3-4d90-95d7-748eed4531eb\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wsvfh"
Apr 24 14:24:18.319330 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:18.319177 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/743626cc-db6b-4ae5-a8dc-cceaa1cb8be0-metrics-tls\") pod \"dns-default-qcxs6\" (UID: \"743626cc-db6b-4ae5-a8dc-cceaa1cb8be0\") " pod="openshift-dns/dns-default-qcxs6"
Apr 24 14:24:18.319402 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:18.319387 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 14:24:18.319454 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:18.319445 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/743626cc-db6b-4ae5-a8dc-cceaa1cb8be0-metrics-tls podName:743626cc-db6b-4ae5-a8dc-cceaa1cb8be0 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:34.319427368 +0000 UTC m=+64.909580075 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/743626cc-db6b-4ae5-a8dc-cceaa1cb8be0-metrics-tls") pod "dns-default-qcxs6" (UID: "743626cc-db6b-4ae5-a8dc-cceaa1cb8be0") : secret "dns-default-metrics-tls" not found
Apr 24 14:24:18.319586 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:18.319561 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 24 14:24:18.319679 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:18.319617 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d63229a9-def3-4d90-95d7-748eed4531eb-networking-console-plugin-cert podName:d63229a9-def3-4d90-95d7-748eed4531eb nodeName:}" failed. No retries permitted until 2026-04-24 14:24:34.319600603 +0000 UTC m=+64.909753305 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/d63229a9-def3-4d90-95d7-748eed4531eb-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-wsvfh" (UID: "d63229a9-def3-4d90-95d7-748eed4531eb") : secret "networking-console-plugin-cert" not found
Apr 24 14:24:18.366707 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:18.366666 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7979n_fee20684-697d-4937-9bf8-9549ab5442bf/console-operator/1.log"
Apr 24 14:24:18.367417 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:18.367392 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7979n_fee20684-697d-4937-9bf8-9549ab5442bf/console-operator/0.log"
Apr 24 14:24:18.367531 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:18.367437 2567 generic.go:358] "Generic (PLEG): container finished" podID="fee20684-697d-4937-9bf8-9549ab5442bf" containerID="768bf4ebc76ff843efce22c358673e364cf0bb5995196d5b24cbbd1db7909189" exitCode=255
Apr 24 14:24:18.367599 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:18.367554 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-7979n" event={"ID":"fee20684-697d-4937-9bf8-9549ab5442bf","Type":"ContainerDied","Data":"768bf4ebc76ff843efce22c358673e364cf0bb5995196d5b24cbbd1db7909189"}
Apr 24 14:24:18.367652 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:18.367595 2567 scope.go:117] "RemoveContainer" containerID="73a47733a07074ebb7eb5dad7c4ff42172d2b892d9b434d01b44bd6ce301c1bd"
Apr 24 14:24:18.367846 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:18.367824 2567 scope.go:117] "RemoveContainer" containerID="768bf4ebc76ff843efce22c358673e364cf0bb5995196d5b24cbbd1db7909189"
Apr 24 14:24:18.368070 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:18.368047 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-7979n_openshift-console-operator(fee20684-697d-4937-9bf8-9549ab5442bf)\"" pod="openshift-console-operator/console-operator-9d4b6777b-7979n" podUID="fee20684-697d-4937-9bf8-9549ab5442bf"
Apr 24 14:24:18.420858 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:18.420461 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d834bc9a-bc43-42cc-82ed-5b3a77d4da5d-cert\") pod \"ingress-canary-95gsb\" (UID: \"d834bc9a-bc43-42cc-82ed-5b3a77d4da5d\") " pod="openshift-ingress-canary/ingress-canary-95gsb"
Apr 24 14:24:18.420858 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:18.420552 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 14:24:18.420858 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:18.420613 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d834bc9a-bc43-42cc-82ed-5b3a77d4da5d-cert podName:d834bc9a-bc43-42cc-82ed-5b3a77d4da5d nodeName:}" failed. No retries permitted until 2026-04-24 14:24:34.420593315 +0000 UTC m=+65.010746025 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d834bc9a-bc43-42cc-82ed-5b3a77d4da5d-cert") pod "ingress-canary-95gsb" (UID: "d834bc9a-bc43-42cc-82ed-5b3a77d4da5d") : secret "canary-serving-cert" not found
Apr 24 14:24:19.371775 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:19.371745 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7979n_fee20684-697d-4937-9bf8-9549ab5442bf/console-operator/1.log"
Apr 24 14:24:19.372292 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:19.372273 2567 scope.go:117] "RemoveContainer" containerID="768bf4ebc76ff843efce22c358673e364cf0bb5995196d5b24cbbd1db7909189"
Apr 24 14:24:19.372477 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:19.372458 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-7979n_openshift-console-operator(fee20684-697d-4937-9bf8-9549ab5442bf)\"" pod="openshift-console-operator/console-operator-9d4b6777b-7979n" podUID="fee20684-697d-4937-9bf8-9549ab5442bf"
Apr 24 14:24:20.377202 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:20.377163 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79d47f7669-4ff6w" event={"ID":"84bc3c16-3cf7-443f-887d-67eaa7bb7631","Type":"ContainerStarted","Data":"ca8301556e9be7816cf2ab3b558fdcd3437abe719de9250745085bf529516ae8"}
Apr 24 14:24:20.377202 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:20.377197 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79d47f7669-4ff6w" event={"ID":"84bc3c16-3cf7-443f-887d-67eaa7bb7631","Type":"ContainerStarted","Data":"8fd7731f9f077b9356c07d779e6a41c7814cba0525067bfdf8c31badfcacc312"}
Apr 24 14:24:20.385623 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:20.385595 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-6mbjk"]
Apr 24 14:24:20.388724 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:20.388705 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-6mbjk"
Apr 24 14:24:20.390710 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:20.390683 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-xzpc5\""
Apr 24 14:24:20.390710 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:20.390709 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 24 14:24:20.390910 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:20.390689 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 24 14:24:20.390910 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:20.390721 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 24 14:24:20.390910 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:20.390846 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 24 14:24:20.396852 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:20.396827 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-6mbjk"]
Apr 24 14:24:20.407065 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:20.407013 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79d47f7669-4ff6w" podStartSLOduration=29.836656997 podStartE2EDuration="46.406998167s" podCreationTimestamp="2026-04-24 14:23:34 +0000 UTC" firstStartedPulling="2026-04-24 14:24:03.1507122 +0000 UTC m=+33.740864915" lastFinishedPulling="2026-04-24 14:24:19.721053382 +0000 UTC m=+50.311206085" observedRunningTime="2026-04-24 14:24:20.405737759 +0000 UTC m=+50.995890481" watchObservedRunningTime="2026-04-24 14:24:20.406998167 +0000 UTC m=+50.997150888"
Apr 24 14:24:20.439928 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:20.439890 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2a479d67-83eb-4a0c-a6ae-94d6f1649b10-signing-key\") pod \"service-ca-865cb79987-6mbjk\" (UID: \"2a479d67-83eb-4a0c-a6ae-94d6f1649b10\") " pod="openshift-service-ca/service-ca-865cb79987-6mbjk"
Apr 24 14:24:20.440118 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:20.439973 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2a479d67-83eb-4a0c-a6ae-94d6f1649b10-signing-cabundle\") pod \"service-ca-865cb79987-6mbjk\" (UID: \"2a479d67-83eb-4a0c-a6ae-94d6f1649b10\") " pod="openshift-service-ca/service-ca-865cb79987-6mbjk"
Apr 24 14:24:20.440118 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:20.439993 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gfx5\" (UniqueName: \"kubernetes.io/projected/2a479d67-83eb-4a0c-a6ae-94d6f1649b10-kube-api-access-2gfx5\") pod \"service-ca-865cb79987-6mbjk\" (UID: \"2a479d67-83eb-4a0c-a6ae-94d6f1649b10\") " pod="openshift-service-ca/service-ca-865cb79987-6mbjk"
Apr 24 14:24:20.541442 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:20.541400 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2a479d67-83eb-4a0c-a6ae-94d6f1649b10-signing-cabundle\") pod \"service-ca-865cb79987-6mbjk\" (UID: \"2a479d67-83eb-4a0c-a6ae-94d6f1649b10\") " pod="openshift-service-ca/service-ca-865cb79987-6mbjk"
Apr 24 14:24:20.541442 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:20.541442 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2gfx5\" (UniqueName: \"kubernetes.io/projected/2a479d67-83eb-4a0c-a6ae-94d6f1649b10-kube-api-access-2gfx5\") pod \"service-ca-865cb79987-6mbjk\" (UID: \"2a479d67-83eb-4a0c-a6ae-94d6f1649b10\") " pod="openshift-service-ca/service-ca-865cb79987-6mbjk"
Apr 24 14:24:20.541684 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:20.541648 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2a479d67-83eb-4a0c-a6ae-94d6f1649b10-signing-key\") pod \"service-ca-865cb79987-6mbjk\" (UID: \"2a479d67-83eb-4a0c-a6ae-94d6f1649b10\") " pod="openshift-service-ca/service-ca-865cb79987-6mbjk"
Apr 24 14:24:20.542215 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:20.542188 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2a479d67-83eb-4a0c-a6ae-94d6f1649b10-signing-cabundle\") pod \"service-ca-865cb79987-6mbjk\" (UID: \"2a479d67-83eb-4a0c-a6ae-94d6f1649b10\") " pod="openshift-service-ca/service-ca-865cb79987-6mbjk"
Apr 24 14:24:20.544201 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:20.544177 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2a479d67-83eb-4a0c-a6ae-94d6f1649b10-signing-key\") pod \"service-ca-865cb79987-6mbjk\" (UID: \"2a479d67-83eb-4a0c-a6ae-94d6f1649b10\") " pod="openshift-service-ca/service-ca-865cb79987-6mbjk"
Apr 24 14:24:20.548822 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:20.548786 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gfx5\" (UniqueName: \"kubernetes.io/projected/2a479d67-83eb-4a0c-a6ae-94d6f1649b10-kube-api-access-2gfx5\") pod \"service-ca-865cb79987-6mbjk\" (UID: \"2a479d67-83eb-4a0c-a6ae-94d6f1649b10\") " pod="openshift-service-ca/service-ca-865cb79987-6mbjk"
Apr 24 14:24:20.702502 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:20.702414 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-6mbjk"
Apr 24 14:24:20.823972 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:20.823937 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-6mbjk"]
Apr 24 14:24:20.826648 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:24:20.826617 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a479d67_83eb_4a0c_a6ae_94d6f1649b10.slice/crio-dfc1f5ef637ad8f4193d5f92ac38835b64d6858ad393ffca1e1ad2bb58facf8a WatchSource:0}: Error finding container dfc1f5ef637ad8f4193d5f92ac38835b64d6858ad393ffca1e1ad2bb58facf8a: Status 404 returned error can't find the container with id dfc1f5ef637ad8f4193d5f92ac38835b64d6858ad393ffca1e1ad2bb58facf8a
Apr 24 14:24:21.380800 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:21.380761 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-6mbjk" event={"ID":"2a479d67-83eb-4a0c-a6ae-94d6f1649b10","Type":"ContainerStarted","Data":"9585da677cb3899b113f1abcf8db8ee36a24923dd7dfdb056c898e30779c1228"}
Apr 24 14:24:21.381364 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:21.380821 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-6mbjk" event={"ID":"2a479d67-83eb-4a0c-a6ae-94d6f1649b10","Type":"ContainerStarted","Data":"dfc1f5ef637ad8f4193d5f92ac38835b64d6858ad393ffca1e1ad2bb58facf8a"}
Apr 24 14:24:21.396842 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:21.396780 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-6mbjk" podStartSLOduration=1.396764583 podStartE2EDuration="1.396764583s" podCreationTimestamp="2026-04-24 14:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:24:21.395936718 +0000 UTC m=+51.986089432" watchObservedRunningTime="2026-04-24 14:24:21.396764583 +0000 UTC m=+51.986917322"
Apr 24 14:24:22.708123 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:22.708078 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-7979n"
Apr 24 14:24:22.708603 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:22.708136 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-7979n"
Apr 24 14:24:22.708717 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:22.708629 2567 scope.go:117] "RemoveContainer" containerID="768bf4ebc76ff843efce22c358673e364cf0bb5995196d5b24cbbd1db7909189"
Apr 24 14:24:22.708883 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:22.708859 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-7979n_openshift-console-operator(fee20684-697d-4937-9bf8-9549ab5442bf)\"" pod="openshift-console-operator/console-operator-9d4b6777b-7979n" podUID="fee20684-697d-4937-9bf8-9549ab5442bf"
Apr 24 14:24:28.198283 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:28.198254 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8jd5s"
Apr 24 14:24:34.018775 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:34.018746 2567 scope.go:117] "RemoveContainer" containerID="768bf4ebc76ff843efce22c358673e364cf0bb5995196d5b24cbbd1db7909189"
Apr 24 14:24:34.262097 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:34.262045 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cbc7c460-1a4a-47be-8d93-efd98ee46239-registry-tls\") pod \"image-registry-55d6cb6cf9-tqwbb\" (UID: \"cbc7c460-1a4a-47be-8d93-efd98ee46239\") " pod="openshift-image-registry/image-registry-55d6cb6cf9-tqwbb"
Apr 24 14:24:34.262285 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:34.262128 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/55813995-c655-4039-9151-23a5a7023a30-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lq2ln\" (UID: \"55813995-c655-4039-9151-23a5a7023a30\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lq2ln"
Apr 24 14:24:34.262285 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:34.262161 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d7576e0a-caf3-4817-8250-2b6570598ac0-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-l59rf\" (UID: \"d7576e0a-caf3-4817-8250-2b6570598ac0\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-l59rf"
Apr 24 14:24:34.264489 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:34.264459 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/55813995-c655-4039-9151-23a5a7023a30-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lq2ln\"
(UID: \"55813995-c655-4039-9151-23a5a7023a30\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lq2ln" Apr 24 14:24:34.264489 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:34.264473 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cbc7c460-1a4a-47be-8d93-efd98ee46239-registry-tls\") pod \"image-registry-55d6cb6cf9-tqwbb\" (UID: \"cbc7c460-1a4a-47be-8d93-efd98ee46239\") " pod="openshift-image-registry/image-registry-55d6cb6cf9-tqwbb" Apr 24 14:24:34.264636 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:34.264478 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d7576e0a-caf3-4817-8250-2b6570598ac0-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-l59rf\" (UID: \"d7576e0a-caf3-4817-8250-2b6570598ac0\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-l59rf" Apr 24 14:24:34.363502 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:34.363469 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d63229a9-def3-4d90-95d7-748eed4531eb-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wsvfh\" (UID: \"d63229a9-def3-4d90-95d7-748eed4531eb\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wsvfh" Apr 24 14:24:34.363682 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:34.363511 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/743626cc-db6b-4ae5-a8dc-cceaa1cb8be0-metrics-tls\") pod \"dns-default-qcxs6\" (UID: \"743626cc-db6b-4ae5-a8dc-cceaa1cb8be0\") " pod="openshift-dns/dns-default-qcxs6" Apr 24 14:24:34.365820 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:34.365785 2567 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/743626cc-db6b-4ae5-a8dc-cceaa1cb8be0-metrics-tls\") pod \"dns-default-qcxs6\" (UID: \"743626cc-db6b-4ae5-a8dc-cceaa1cb8be0\") " pod="openshift-dns/dns-default-qcxs6" Apr 24 14:24:34.365933 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:34.365859 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d63229a9-def3-4d90-95d7-748eed4531eb-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wsvfh\" (UID: \"d63229a9-def3-4d90-95d7-748eed4531eb\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wsvfh" Apr 24 14:24:34.414233 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:34.414207 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7979n_fee20684-697d-4937-9bf8-9549ab5442bf/console-operator/2.log" Apr 24 14:24:34.414584 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:34.414568 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7979n_fee20684-697d-4937-9bf8-9549ab5442bf/console-operator/1.log" Apr 24 14:24:34.414636 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:34.414602 2567 generic.go:358] "Generic (PLEG): container finished" podID="fee20684-697d-4937-9bf8-9549ab5442bf" containerID="51b4bf202790d494b22691ed8303bd45ceee6af1077513b5af5336f20c53d163" exitCode=255 Apr 24 14:24:34.414668 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:34.414636 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-7979n" event={"ID":"fee20684-697d-4937-9bf8-9549ab5442bf","Type":"ContainerDied","Data":"51b4bf202790d494b22691ed8303bd45ceee6af1077513b5af5336f20c53d163"} Apr 24 14:24:34.414668 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:34.414663 2567 scope.go:117] 
"RemoveContainer" containerID="768bf4ebc76ff843efce22c358673e364cf0bb5995196d5b24cbbd1db7909189" Apr 24 14:24:34.415017 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:34.414998 2567 scope.go:117] "RemoveContainer" containerID="51b4bf202790d494b22691ed8303bd45ceee6af1077513b5af5336f20c53d163" Apr 24 14:24:34.415198 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:34.415181 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-7979n_openshift-console-operator(fee20684-697d-4937-9bf8-9549ab5442bf)\"" pod="openshift-console-operator/console-operator-9d4b6777b-7979n" podUID="fee20684-697d-4937-9bf8-9549ab5442bf" Apr 24 14:24:34.434590 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:34.434559 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-vhdj2\"" Apr 24 14:24:34.443214 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:34.443191 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-l59rf" Apr 24 14:24:34.457086 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:34.457067 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-vfj9f\"" Apr 24 14:24:34.464067 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:34.464040 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d834bc9a-bc43-42cc-82ed-5b3a77d4da5d-cert\") pod \"ingress-canary-95gsb\" (UID: \"d834bc9a-bc43-42cc-82ed-5b3a77d4da5d\") " pod="openshift-ingress-canary/ingress-canary-95gsb" Apr 24 14:24:34.465486 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:34.465466 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-55d6cb6cf9-tqwbb" Apr 24 14:24:34.466385 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:34.466367 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d834bc9a-bc43-42cc-82ed-5b3a77d4da5d-cert\") pod \"ingress-canary-95gsb\" (UID: \"d834bc9a-bc43-42cc-82ed-5b3a77d4da5d\") " pod="openshift-ingress-canary/ingress-canary-95gsb" Apr 24 14:24:34.497739 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:34.497706 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-8bls2\"" Apr 24 14:24:34.506236 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:34.506197 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lq2ln" Apr 24 14:24:34.521554 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:34.521519 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vrfmr\"" Apr 24 14:24:34.538153 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:34.530990 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-qcxs6" Apr 24 14:24:34.586388 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:34.586174 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-gjz9f\"" Apr 24 14:24:34.594224 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:34.594140 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wsvfh" Apr 24 14:24:34.601877 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:34.601795 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-l59rf"] Apr 24 14:24:34.605322 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:24:34.605294 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7576e0a_caf3_4817_8250_2b6570598ac0.slice/crio-1ef69c02707e63e76f10864ac911700c846a751ba8df8b802853cce46257b699 WatchSource:0}: Error finding container 1ef69c02707e63e76f10864ac911700c846a751ba8df8b802853cce46257b699: Status 404 returned error can't find the container with id 1ef69c02707e63e76f10864ac911700c846a751ba8df8b802853cce46257b699 Apr 24 14:24:34.634878 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:34.634798 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-55d6cb6cf9-tqwbb"] Apr 24 14:24:34.642498 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:24:34.642467 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbc7c460_1a4a_47be_8d93_efd98ee46239.slice/crio-24e989e8b2085a04478fb15aba8aabbb25491206a82c0f87203926dddf34b6ef WatchSource:0}: Error finding container 24e989e8b2085a04478fb15aba8aabbb25491206a82c0f87203926dddf34b6ef: Status 404 returned error can't find the container with id 24e989e8b2085a04478fb15aba8aabbb25491206a82c0f87203926dddf34b6ef Apr 24 14:24:34.653866 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:34.653791 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9tl5g\"" Apr 24 14:24:34.664212 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:34.663755 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-95gsb" Apr 24 14:24:34.665320 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:34.665255 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lq2ln"] Apr 24 14:24:34.709703 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:34.709672 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qcxs6"] Apr 24 14:24:34.722894 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:24:34.721820 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod743626cc_db6b_4ae5_a8dc_cceaa1cb8be0.slice/crio-926f3657aac08ff535387e4d792617145c7acb75797c6b65f19fc5068766c1f6 WatchSource:0}: Error finding container 926f3657aac08ff535387e4d792617145c7acb75797c6b65f19fc5068766c1f6: Status 404 returned error can't find the container with id 926f3657aac08ff535387e4d792617145c7acb75797c6b65f19fc5068766c1f6 Apr 24 14:24:34.744345 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:34.744123 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-wsvfh"] Apr 24 14:24:34.747319 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:24:34.747285 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd63229a9_def3_4d90_95d7_748eed4531eb.slice/crio-c41aae0bb42d7f187efb39340d576e3e56771b645ab7e4031cc70c06c192d9d7 WatchSource:0}: Error finding container c41aae0bb42d7f187efb39340d576e3e56771b645ab7e4031cc70c06c192d9d7: Status 404 returned error can't find the container with id c41aae0bb42d7f187efb39340d576e3e56771b645ab7e4031cc70c06c192d9d7 Apr 24 14:24:34.817526 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:34.817497 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-95gsb"] Apr 24 
14:24:34.820730 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:24:34.820705 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd834bc9a_bc43_42cc_82ed_5b3a77d4da5d.slice/crio-78c89bffd6c7b56107ae85e74fc55f7d144687cc70b162bea79021e97e979d00 WatchSource:0}: Error finding container 78c89bffd6c7b56107ae85e74fc55f7d144687cc70b162bea79021e97e979d00: Status 404 returned error can't find the container with id 78c89bffd6c7b56107ae85e74fc55f7d144687cc70b162bea79021e97e979d00 Apr 24 14:24:35.429370 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:35.429333 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wsvfh" event={"ID":"d63229a9-def3-4d90-95d7-748eed4531eb","Type":"ContainerStarted","Data":"c41aae0bb42d7f187efb39340d576e3e56771b645ab7e4031cc70c06c192d9d7"} Apr 24 14:24:35.430653 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:35.430604 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qcxs6" event={"ID":"743626cc-db6b-4ae5-a8dc-cceaa1cb8be0","Type":"ContainerStarted","Data":"926f3657aac08ff535387e4d792617145c7acb75797c6b65f19fc5068766c1f6"} Apr 24 14:24:35.432263 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:35.432210 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lq2ln" event={"ID":"55813995-c655-4039-9151-23a5a7023a30","Type":"ContainerStarted","Data":"3a5285d6af76cb2f0028da93e3e39f5ae39d4e2c38439bf8c67f846a3aa2e580"} Apr 24 14:24:35.441832 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:35.441790 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7979n_fee20684-697d-4937-9bf8-9549ab5442bf/console-operator/2.log" Apr 24 14:24:35.445869 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:35.445497 2567 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-image-registry/image-registry-55d6cb6cf9-tqwbb" event={"ID":"cbc7c460-1a4a-47be-8d93-efd98ee46239","Type":"ContainerStarted","Data":"e1ef6b5cf29a66cf9ccdb5ac8e912e73dd9acb1c611d2f42a51c0dafd9c2702e"} Apr 24 14:24:35.445869 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:35.445532 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-55d6cb6cf9-tqwbb" event={"ID":"cbc7c460-1a4a-47be-8d93-efd98ee46239","Type":"ContainerStarted","Data":"24e989e8b2085a04478fb15aba8aabbb25491206a82c0f87203926dddf34b6ef"} Apr 24 14:24:35.446033 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:35.445995 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-55d6cb6cf9-tqwbb" Apr 24 14:24:35.447460 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:35.447421 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-95gsb" event={"ID":"d834bc9a-bc43-42cc-82ed-5b3a77d4da5d","Type":"ContainerStarted","Data":"78c89bffd6c7b56107ae85e74fc55f7d144687cc70b162bea79021e97e979d00"} Apr 24 14:24:35.448876 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:35.448854 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-l59rf" event={"ID":"d7576e0a-caf3-4817-8250-2b6570598ac0","Type":"ContainerStarted","Data":"1ef69c02707e63e76f10864ac911700c846a751ba8df8b802853cce46257b699"} Apr 24 14:24:35.466578 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:35.465571 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-55d6cb6cf9-tqwbb" podStartSLOduration=65.465551354 podStartE2EDuration="1m5.465551354s" podCreationTimestamp="2026-04-24 14:23:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:24:35.464294639 
+0000 UTC m=+66.054447374" watchObservedRunningTime="2026-04-24 14:24:35.465551354 +0000 UTC m=+66.055704077" Apr 24 14:24:35.775780 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:35.775698 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/952d5757-28bc-4940-9fa6-4a50ffff6476-metrics-certs\") pod \"network-metrics-daemon-ct9nz\" (UID: \"952d5757-28bc-4940-9fa6-4a50ffff6476\") " pod="openshift-multus/network-metrics-daemon-ct9nz" Apr 24 14:24:35.786710 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:35.786118 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 14:24:35.805126 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:35.805097 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/952d5757-28bc-4940-9fa6-4a50ffff6476-metrics-certs\") pod \"network-metrics-daemon-ct9nz\" (UID: \"952d5757-28bc-4940-9fa6-4a50ffff6476\") " pod="openshift-multus/network-metrics-daemon-ct9nz" Apr 24 14:24:35.842068 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:35.841852 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-p2dnd\"" Apr 24 14:24:35.850846 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:35.850404 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ct9nz" Apr 24 14:24:36.036386 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:36.036286 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ct9nz"] Apr 24 14:24:36.453055 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:36.453021 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ct9nz" event={"ID":"952d5757-28bc-4940-9fa6-4a50ffff6476","Type":"ContainerStarted","Data":"0979839f801f17a07ac991a11f6c5593d7a0f05205ad65e18cd83533d43f7195"} Apr 24 14:24:39.416006 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.415971 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-jvth8"] Apr 24 14:24:39.419452 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.419427 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-jvth8" Apr 24 14:24:39.421921 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.421891 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 14:24:39.422601 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.422579 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-9fmsb\"" Apr 24 14:24:39.422720 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.422628 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 24 14:24:39.422720 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.422692 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 24 14:24:39.424188 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.424166 2567 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 24 14:24:39.426462 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.426441 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 24 14:24:39.437615 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.437590 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-jvth8"] Apr 24 14:24:39.465462 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.465426 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-95gsb" event={"ID":"d834bc9a-bc43-42cc-82ed-5b3a77d4da5d","Type":"ContainerStarted","Data":"cc141c29cd8a6967cc18e2c15e72e30a3901bf3204a6567bb5dc74a718f82872"} Apr 24 14:24:39.467767 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.467737 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ct9nz" event={"ID":"952d5757-28bc-4940-9fa6-4a50ffff6476","Type":"ContainerStarted","Data":"9dff3c2e59ef3237c452af9e14835c72789794cb2752dc8643711ffd2a5ca785"} Apr 24 14:24:39.467961 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.467773 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ct9nz" event={"ID":"952d5757-28bc-4940-9fa6-4a50ffff6476","Type":"ContainerStarted","Data":"37e98b173ce79bce11db1d860ff6ba69d301f622a2062aacc6619e306432055f"} Apr 24 14:24:39.469274 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.469249 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wsvfh" event={"ID":"d63229a9-def3-4d90-95d7-748eed4531eb","Type":"ContainerStarted","Data":"0d17e39ffcba0d0a740e609eb738a50d40e68017bab2aea4fc57db773e307088"} Apr 24 14:24:39.470970 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.470944 2567 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lq2ln" event={"ID":"55813995-c655-4039-9151-23a5a7023a30","Type":"ContainerStarted","Data":"cba1afdf57d5fafe144d77ee6d50ff7d07d3803df3d95b537e7a4cbf8926cee0"} Apr 24 14:24:39.471070 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.470976 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lq2ln" event={"ID":"55813995-c655-4039-9151-23a5a7023a30","Type":"ContainerStarted","Data":"455bc93e7c454ba192c37097b84a8a6c3d59e292474bf3ce4fe6abf1339c05e5"} Apr 24 14:24:39.472281 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.472260 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-l59rf" event={"ID":"d7576e0a-caf3-4817-8250-2b6570598ac0","Type":"ContainerStarted","Data":"4109702dda7cb737f00abf4bf9b8a2a5cf3a94ebce0b866b0cb97f0276db7878"} Apr 24 14:24:39.473789 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.473769 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qcxs6" event={"ID":"743626cc-db6b-4ae5-a8dc-cceaa1cb8be0","Type":"ContainerStarted","Data":"895c1df8264da9aa708ae23569d9fca75f9057ef56095ee7fc324d966a81a19f"} Apr 24 14:24:39.473896 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.473793 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qcxs6" event={"ID":"743626cc-db6b-4ae5-a8dc-cceaa1cb8be0","Type":"ContainerStarted","Data":"98d44fea33c6e9464ca7072a1d2a0ce8173f6f3871d58e935f04b1febc05a761"} Apr 24 14:24:39.473945 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.473937 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-qcxs6" Apr 24 14:24:39.495981 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.495925 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ingress-canary/ingress-canary-95gsb" podStartSLOduration=33.557941427 podStartE2EDuration="37.495909265s" podCreationTimestamp="2026-04-24 14:24:02 +0000 UTC" firstStartedPulling="2026-04-24 14:24:34.822464206 +0000 UTC m=+65.412616906" lastFinishedPulling="2026-04-24 14:24:38.760432029 +0000 UTC m=+69.350584744" observedRunningTime="2026-04-24 14:24:39.49420982 +0000 UTC m=+70.084362542" watchObservedRunningTime="2026-04-24 14:24:39.495909265 +0000 UTC m=+70.086061987" Apr 24 14:24:39.499893 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.499864 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8gtlg"] Apr 24 14:24:39.503091 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.503070 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-78bb745fb6-x54tc"] Apr 24 14:24:39.503242 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.503226 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8gtlg" Apr 24 14:24:39.506439 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.506415 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-cwc6x"] Apr 24 14:24:39.507067 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.506975 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 24 14:24:39.507182 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.507132 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-vvqt8\"" Apr 24 14:24:39.507368 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.507346 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-78bb745fb6-x54tc" Apr 24 14:24:39.509715 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.509693 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 24 14:24:39.509877 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.509771 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-pmxd7\"" Apr 24 14:24:39.509937 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.509700 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 24 14:24:39.510281 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.510122 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 24 14:24:39.510281 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.510139 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 24 14:24:39.510281 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.510164 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 24 14:24:39.510507 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.510490 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-cwc6x" Apr 24 14:24:39.511027 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.511012 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 24 14:24:39.511211 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.511194 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a1850b6-1bc3-4ff6-a567-7c2e8813a59d-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-jvth8\" (UID: \"0a1850b6-1bc3-4ff6-a567-7c2e8813a59d\") " pod="openshift-insights/insights-operator-585dfdc468-jvth8" Apr 24 14:24:39.511278 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.511236 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/0a1850b6-1bc3-4ff6-a567-7c2e8813a59d-snapshots\") pod \"insights-operator-585dfdc468-jvth8\" (UID: \"0a1850b6-1bc3-4ff6-a567-7c2e8813a59d\") " pod="openshift-insights/insights-operator-585dfdc468-jvth8" Apr 24 14:24:39.511278 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.511265 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a1850b6-1bc3-4ff6-a567-7c2e8813a59d-serving-cert\") pod \"insights-operator-585dfdc468-jvth8\" (UID: \"0a1850b6-1bc3-4ff6-a567-7c2e8813a59d\") " pod="openshift-insights/insights-operator-585dfdc468-jvth8" Apr 24 14:24:39.511371 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.511292 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0a1850b6-1bc3-4ff6-a567-7c2e8813a59d-tmp\") pod \"insights-operator-585dfdc468-jvth8\" (UID: \"0a1850b6-1bc3-4ff6-a567-7c2e8813a59d\") " 
pod="openshift-insights/insights-operator-585dfdc468-jvth8" Apr 24 14:24:39.511423 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.511383 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a1850b6-1bc3-4ff6-a567-7c2e8813a59d-service-ca-bundle\") pod \"insights-operator-585dfdc468-jvth8\" (UID: \"0a1850b6-1bc3-4ff6-a567-7c2e8813a59d\") " pod="openshift-insights/insights-operator-585dfdc468-jvth8" Apr 24 14:24:39.511469 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.511422 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v5x5\" (UniqueName: \"kubernetes.io/projected/0a1850b6-1bc3-4ff6-a567-7c2e8813a59d-kube-api-access-8v5x5\") pod \"insights-operator-585dfdc468-jvth8\" (UID: \"0a1850b6-1bc3-4ff6-a567-7c2e8813a59d\") " pod="openshift-insights/insights-operator-585dfdc468-jvth8" Apr 24 14:24:39.512657 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.512638 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-t7shp\"" Apr 24 14:24:39.512977 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.512954 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 24 14:24:39.513076 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.512996 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 24 14:24:39.517472 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.517434 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8gtlg"] Apr 24 14:24:39.520883 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.520016 2567 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-cwc6x"] Apr 24 14:24:39.528880 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.528797 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-78bb745fb6-x54tc"] Apr 24 14:24:39.531858 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.530474 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-ct9nz" podStartSLOduration=66.75248921 podStartE2EDuration="1m9.530455792s" podCreationTimestamp="2026-04-24 14:23:30 +0000 UTC" firstStartedPulling="2026-04-24 14:24:36.048066931 +0000 UTC m=+66.638219645" lastFinishedPulling="2026-04-24 14:24:38.826033519 +0000 UTC m=+69.416186227" observedRunningTime="2026-04-24 14:24:39.525724913 +0000 UTC m=+70.115877637" watchObservedRunningTime="2026-04-24 14:24:39.530455792 +0000 UTC m=+70.120608516" Apr 24 14:24:39.554200 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.554140 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-l59rf" podStartSLOduration=65.40328318 podStartE2EDuration="1m9.554119159s" podCreationTimestamp="2026-04-24 14:23:30 +0000 UTC" firstStartedPulling="2026-04-24 14:24:34.609401631 +0000 UTC m=+65.199554337" lastFinishedPulling="2026-04-24 14:24:38.760237604 +0000 UTC m=+69.350390316" observedRunningTime="2026-04-24 14:24:39.553469554 +0000 UTC m=+70.143622276" watchObservedRunningTime="2026-04-24 14:24:39.554119159 +0000 UTC m=+70.144271884" Apr 24 14:24:39.578899 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.578829 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-qcxs6" podStartSLOduration=33.542202253 podStartE2EDuration="37.578795289s" podCreationTimestamp="2026-04-24 14:24:02 +0000 UTC" firstStartedPulling="2026-04-24 14:24:34.723825753 +0000 
UTC m=+65.313978453" lastFinishedPulling="2026-04-24 14:24:38.760418788 +0000 UTC m=+69.350571489" observedRunningTime="2026-04-24 14:24:39.57649432 +0000 UTC m=+70.166647065" watchObservedRunningTime="2026-04-24 14:24:39.578795289 +0000 UTC m=+70.168948011" Apr 24 14:24:39.604659 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.604598 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lq2ln" podStartSLOduration=63.604895632 podStartE2EDuration="1m7.6045796s" podCreationTimestamp="2026-04-24 14:23:32 +0000 UTC" firstStartedPulling="2026-04-24 14:24:34.761015874 +0000 UTC m=+65.351168578" lastFinishedPulling="2026-04-24 14:24:38.760699837 +0000 UTC m=+69.350852546" observedRunningTime="2026-04-24 14:24:39.60344006 +0000 UTC m=+70.193592781" watchObservedRunningTime="2026-04-24 14:24:39.6045796 +0000 UTC m=+70.194732324" Apr 24 14:24:39.612071 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.612028 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmfwq\" (UniqueName: \"kubernetes.io/projected/114b4da4-48aa-45ef-9304-25ef4821570d-kube-api-access-tmfwq\") pod \"migrator-74bb7799d9-cwc6x\" (UID: \"114b4da4-48aa-45ef-9304-25ef4821570d\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-cwc6x" Apr 24 14:24:39.612071 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.612071 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/0a1850b6-1bc3-4ff6-a567-7c2e8813a59d-snapshots\") pod \"insights-operator-585dfdc468-jvth8\" (UID: \"0a1850b6-1bc3-4ff6-a567-7c2e8813a59d\") " pod="openshift-insights/insights-operator-585dfdc468-jvth8" Apr 24 14:24:39.612283 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.612093 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tls-certificates\" (UniqueName: \"kubernetes.io/secret/46136cc4-31fa-4d7b-a223-af0007d2fd5a-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-8gtlg\" (UID: \"46136cc4-31fa-4d7b-a223-af0007d2fd5a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8gtlg" Apr 24 14:24:39.612283 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.612113 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e88e3b18-0568-4c3d-9672-99f01f7456b4-service-ca-bundle\") pod \"router-default-78bb745fb6-x54tc\" (UID: \"e88e3b18-0568-4c3d-9672-99f01f7456b4\") " pod="openshift-ingress/router-default-78bb745fb6-x54tc" Apr 24 14:24:39.612283 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.612133 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mcpw\" (UniqueName: \"kubernetes.io/projected/e88e3b18-0568-4c3d-9672-99f01f7456b4-kube-api-access-8mcpw\") pod \"router-default-78bb745fb6-x54tc\" (UID: \"e88e3b18-0568-4c3d-9672-99f01f7456b4\") " pod="openshift-ingress/router-default-78bb745fb6-x54tc" Apr 24 14:24:39.612283 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.612191 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a1850b6-1bc3-4ff6-a567-7c2e8813a59d-serving-cert\") pod \"insights-operator-585dfdc468-jvth8\" (UID: \"0a1850b6-1bc3-4ff6-a567-7c2e8813a59d\") " pod="openshift-insights/insights-operator-585dfdc468-jvth8" Apr 24 14:24:39.612283 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.612216 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0a1850b6-1bc3-4ff6-a567-7c2e8813a59d-tmp\") pod \"insights-operator-585dfdc468-jvth8\" (UID: \"0a1850b6-1bc3-4ff6-a567-7c2e8813a59d\") " 
pod="openshift-insights/insights-operator-585dfdc468-jvth8" Apr 24 14:24:39.614370 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.612608 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a1850b6-1bc3-4ff6-a567-7c2e8813a59d-service-ca-bundle\") pod \"insights-operator-585dfdc468-jvth8\" (UID: \"0a1850b6-1bc3-4ff6-a567-7c2e8813a59d\") " pod="openshift-insights/insights-operator-585dfdc468-jvth8" Apr 24 14:24:39.614370 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.612679 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8v5x5\" (UniqueName: \"kubernetes.io/projected/0a1850b6-1bc3-4ff6-a567-7c2e8813a59d-kube-api-access-8v5x5\") pod \"insights-operator-585dfdc468-jvth8\" (UID: \"0a1850b6-1bc3-4ff6-a567-7c2e8813a59d\") " pod="openshift-insights/insights-operator-585dfdc468-jvth8" Apr 24 14:24:39.614370 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.612700 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0a1850b6-1bc3-4ff6-a567-7c2e8813a59d-tmp\") pod \"insights-operator-585dfdc468-jvth8\" (UID: \"0a1850b6-1bc3-4ff6-a567-7c2e8813a59d\") " pod="openshift-insights/insights-operator-585dfdc468-jvth8" Apr 24 14:24:39.614370 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.612781 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e88e3b18-0568-4c3d-9672-99f01f7456b4-metrics-certs\") pod \"router-default-78bb745fb6-x54tc\" (UID: \"e88e3b18-0568-4c3d-9672-99f01f7456b4\") " pod="openshift-ingress/router-default-78bb745fb6-x54tc" Apr 24 14:24:39.614370 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.612791 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: 
\"kubernetes.io/empty-dir/0a1850b6-1bc3-4ff6-a567-7c2e8813a59d-snapshots\") pod \"insights-operator-585dfdc468-jvth8\" (UID: \"0a1850b6-1bc3-4ff6-a567-7c2e8813a59d\") " pod="openshift-insights/insights-operator-585dfdc468-jvth8" Apr 24 14:24:39.614370 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.612872 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a1850b6-1bc3-4ff6-a567-7c2e8813a59d-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-jvth8\" (UID: \"0a1850b6-1bc3-4ff6-a567-7c2e8813a59d\") " pod="openshift-insights/insights-operator-585dfdc468-jvth8" Apr 24 14:24:39.614370 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.612906 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e88e3b18-0568-4c3d-9672-99f01f7456b4-stats-auth\") pod \"router-default-78bb745fb6-x54tc\" (UID: \"e88e3b18-0568-4c3d-9672-99f01f7456b4\") " pod="openshift-ingress/router-default-78bb745fb6-x54tc" Apr 24 14:24:39.614370 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.613306 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e88e3b18-0568-4c3d-9672-99f01f7456b4-default-certificate\") pod \"router-default-78bb745fb6-x54tc\" (UID: \"e88e3b18-0568-4c3d-9672-99f01f7456b4\") " pod="openshift-ingress/router-default-78bb745fb6-x54tc" Apr 24 14:24:39.614370 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.613571 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a1850b6-1bc3-4ff6-a567-7c2e8813a59d-service-ca-bundle\") pod \"insights-operator-585dfdc468-jvth8\" (UID: \"0a1850b6-1bc3-4ff6-a567-7c2e8813a59d\") " pod="openshift-insights/insights-operator-585dfdc468-jvth8" Apr 24 14:24:39.614981 
ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.614751 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a1850b6-1bc3-4ff6-a567-7c2e8813a59d-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-jvth8\" (UID: \"0a1850b6-1bc3-4ff6-a567-7c2e8813a59d\") " pod="openshift-insights/insights-operator-585dfdc468-jvth8" Apr 24 14:24:39.616354 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.616336 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a1850b6-1bc3-4ff6-a567-7c2e8813a59d-serving-cert\") pod \"insights-operator-585dfdc468-jvth8\" (UID: \"0a1850b6-1bc3-4ff6-a567-7c2e8813a59d\") " pod="openshift-insights/insights-operator-585dfdc468-jvth8" Apr 24 14:24:39.623791 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.623764 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v5x5\" (UniqueName: \"kubernetes.io/projected/0a1850b6-1bc3-4ff6-a567-7c2e8813a59d-kube-api-access-8v5x5\") pod \"insights-operator-585dfdc468-jvth8\" (UID: \"0a1850b6-1bc3-4ff6-a567-7c2e8813a59d\") " pod="openshift-insights/insights-operator-585dfdc468-jvth8" Apr 24 14:24:39.630913 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.630763 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wsvfh" podStartSLOduration=53.619641395 podStartE2EDuration="57.630725956s" podCreationTimestamp="2026-04-24 14:23:42 +0000 UTC" firstStartedPulling="2026-04-24 14:24:34.749323015 +0000 UTC m=+65.339475716" lastFinishedPulling="2026-04-24 14:24:38.760407568 +0000 UTC m=+69.350560277" observedRunningTime="2026-04-24 14:24:39.630296498 +0000 UTC m=+70.220449219" watchObservedRunningTime="2026-04-24 14:24:39.630725956 +0000 UTC m=+70.220878679" Apr 24 14:24:39.714392 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.714298 
2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e88e3b18-0568-4c3d-9672-99f01f7456b4-metrics-certs\") pod \"router-default-78bb745fb6-x54tc\" (UID: \"e88e3b18-0568-4c3d-9672-99f01f7456b4\") " pod="openshift-ingress/router-default-78bb745fb6-x54tc" Apr 24 14:24:39.714392 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.714348 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e88e3b18-0568-4c3d-9672-99f01f7456b4-stats-auth\") pod \"router-default-78bb745fb6-x54tc\" (UID: \"e88e3b18-0568-4c3d-9672-99f01f7456b4\") " pod="openshift-ingress/router-default-78bb745fb6-x54tc" Apr 24 14:24:39.714392 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.714376 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e88e3b18-0568-4c3d-9672-99f01f7456b4-default-certificate\") pod \"router-default-78bb745fb6-x54tc\" (UID: \"e88e3b18-0568-4c3d-9672-99f01f7456b4\") " pod="openshift-ingress/router-default-78bb745fb6-x54tc" Apr 24 14:24:39.714671 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.714406 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tmfwq\" (UniqueName: \"kubernetes.io/projected/114b4da4-48aa-45ef-9304-25ef4821570d-kube-api-access-tmfwq\") pod \"migrator-74bb7799d9-cwc6x\" (UID: \"114b4da4-48aa-45ef-9304-25ef4821570d\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-cwc6x" Apr 24 14:24:39.714671 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.714438 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/46136cc4-31fa-4d7b-a223-af0007d2fd5a-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-8gtlg\" (UID: 
\"46136cc4-31fa-4d7b-a223-af0007d2fd5a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8gtlg" Apr 24 14:24:39.714671 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.714460 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e88e3b18-0568-4c3d-9672-99f01f7456b4-service-ca-bundle\") pod \"router-default-78bb745fb6-x54tc\" (UID: \"e88e3b18-0568-4c3d-9672-99f01f7456b4\") " pod="openshift-ingress/router-default-78bb745fb6-x54tc" Apr 24 14:24:39.714671 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.714484 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8mcpw\" (UniqueName: \"kubernetes.io/projected/e88e3b18-0568-4c3d-9672-99f01f7456b4-kube-api-access-8mcpw\") pod \"router-default-78bb745fb6-x54tc\" (UID: \"e88e3b18-0568-4c3d-9672-99f01f7456b4\") " pod="openshift-ingress/router-default-78bb745fb6-x54tc" Apr 24 14:24:39.715419 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.715394 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e88e3b18-0568-4c3d-9672-99f01f7456b4-service-ca-bundle\") pod \"router-default-78bb745fb6-x54tc\" (UID: \"e88e3b18-0568-4c3d-9672-99f01f7456b4\") " pod="openshift-ingress/router-default-78bb745fb6-x54tc" Apr 24 14:24:39.716922 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.716893 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e88e3b18-0568-4c3d-9672-99f01f7456b4-stats-auth\") pod \"router-default-78bb745fb6-x54tc\" (UID: \"e88e3b18-0568-4c3d-9672-99f01f7456b4\") " pod="openshift-ingress/router-default-78bb745fb6-x54tc" Apr 24 14:24:39.717303 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.717283 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/e88e3b18-0568-4c3d-9672-99f01f7456b4-metrics-certs\") pod \"router-default-78bb745fb6-x54tc\" (UID: \"e88e3b18-0568-4c3d-9672-99f01f7456b4\") " pod="openshift-ingress/router-default-78bb745fb6-x54tc" Apr 24 14:24:39.717347 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.717283 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/46136cc4-31fa-4d7b-a223-af0007d2fd5a-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-8gtlg\" (UID: \"46136cc4-31fa-4d7b-a223-af0007d2fd5a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8gtlg" Apr 24 14:24:39.717388 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.717373 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e88e3b18-0568-4c3d-9672-99f01f7456b4-default-certificate\") pod \"router-default-78bb745fb6-x54tc\" (UID: \"e88e3b18-0568-4c3d-9672-99f01f7456b4\") " pod="openshift-ingress/router-default-78bb745fb6-x54tc" Apr 24 14:24:39.723672 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.723649 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmfwq\" (UniqueName: \"kubernetes.io/projected/114b4da4-48aa-45ef-9304-25ef4821570d-kube-api-access-tmfwq\") pod \"migrator-74bb7799d9-cwc6x\" (UID: \"114b4da4-48aa-45ef-9304-25ef4821570d\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-cwc6x" Apr 24 14:24:39.723783 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.723760 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mcpw\" (UniqueName: \"kubernetes.io/projected/e88e3b18-0568-4c3d-9672-99f01f7456b4-kube-api-access-8mcpw\") pod \"router-default-78bb745fb6-x54tc\" (UID: \"e88e3b18-0568-4c3d-9672-99f01f7456b4\") " pod="openshift-ingress/router-default-78bb745fb6-x54tc" Apr 24 14:24:39.731535 
ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.731511 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-jvth8" Apr 24 14:24:39.821399 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.821366 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8gtlg" Apr 24 14:24:39.833171 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.833127 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-78bb745fb6-x54tc" Apr 24 14:24:39.840510 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.840486 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-cwc6x" Apr 24 14:24:39.880080 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.880012 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-jvth8"] Apr 24 14:24:39.887908 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:24:39.887273 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a1850b6_1bc3_4ff6_a567_7c2e8813a59d.slice/crio-e6023e239d61e38a45f93742ec7d2e03f63d2aed9a21672d80f2bd0095598092 WatchSource:0}: Error finding container e6023e239d61e38a45f93742ec7d2e03f63d2aed9a21672d80f2bd0095598092: Status 404 returned error can't find the container with id e6023e239d61e38a45f93742ec7d2e03f63d2aed9a21672d80f2bd0095598092 Apr 24 14:24:39.973067 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:39.973038 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8gtlg"] Apr 24 14:24:39.976207 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:24:39.976176 2567 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46136cc4_31fa_4d7b_a223_af0007d2fd5a.slice/crio-78add2524985e474b31433d6092a1675c06b7ea57da65ae0b4309b3e9bfc4d6f WatchSource:0}: Error finding container 78add2524985e474b31433d6092a1675c06b7ea57da65ae0b4309b3e9bfc4d6f: Status 404 returned error can't find the container with id 78add2524985e474b31433d6092a1675c06b7ea57da65ae0b4309b3e9bfc4d6f Apr 24 14:24:40.197490 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:40.197439 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-cwc6x"] Apr 24 14:24:40.199081 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:40.199053 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-78bb745fb6-x54tc"] Apr 24 14:24:40.200184 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:24:40.200150 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod114b4da4_48aa_45ef_9304_25ef4821570d.slice/crio-6cbe5a902030446335155f61a3321c48177a81847452c9c6676e49edad5ea1b5 WatchSource:0}: Error finding container 6cbe5a902030446335155f61a3321c48177a81847452c9c6676e49edad5ea1b5: Status 404 returned error can't find the container with id 6cbe5a902030446335155f61a3321c48177a81847452c9c6676e49edad5ea1b5 Apr 24 14:24:40.202362 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:24:40.202347 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode88e3b18_0568_4c3d_9672_99f01f7456b4.slice/crio-1b81644364e4934f405a2279c9d2c28860ab35a9505d1551c015b752eb9c4f15 WatchSource:0}: Error finding container 1b81644364e4934f405a2279c9d2c28860ab35a9505d1551c015b752eb9c4f15: Status 404 returned error can't find the container with id 1b81644364e4934f405a2279c9d2c28860ab35a9505d1551c015b752eb9c4f15 Apr 24 14:24:40.479112 ip-10-0-128-36 kubenswrapper[2567]: I0424 
14:24:40.479076 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-78bb745fb6-x54tc" event={"ID":"e88e3b18-0568-4c3d-9672-99f01f7456b4","Type":"ContainerStarted","Data":"6b1be67af856ab9ddf8bf3bca9c3c051f6d361d1140bb602dc919d49e4f3dafb"} Apr 24 14:24:40.479552 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:40.479122 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-78bb745fb6-x54tc" event={"ID":"e88e3b18-0568-4c3d-9672-99f01f7456b4","Type":"ContainerStarted","Data":"1b81644364e4934f405a2279c9d2c28860ab35a9505d1551c015b752eb9c4f15"} Apr 24 14:24:40.480570 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:40.480540 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8gtlg" event={"ID":"46136cc4-31fa-4d7b-a223-af0007d2fd5a","Type":"ContainerStarted","Data":"78add2524985e474b31433d6092a1675c06b7ea57da65ae0b4309b3e9bfc4d6f"} Apr 24 14:24:40.481998 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:40.481963 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-cwc6x" event={"ID":"114b4da4-48aa-45ef-9304-25ef4821570d","Type":"ContainerStarted","Data":"6cbe5a902030446335155f61a3321c48177a81847452c9c6676e49edad5ea1b5"} Apr 24 14:24:40.488524 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:40.488444 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-jvth8" event={"ID":"0a1850b6-1bc3-4ff6-a567-7c2e8813a59d","Type":"ContainerStarted","Data":"e6023e239d61e38a45f93742ec7d2e03f63d2aed9a21672d80f2bd0095598092"} Apr 24 14:24:40.503284 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:40.503221 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-78bb745fb6-x54tc" podStartSLOduration=1.5032034250000001 podStartE2EDuration="1.503203425s" 
podCreationTimestamp="2026-04-24 14:24:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:24:40.50227942 +0000 UTC m=+71.092432143" watchObservedRunningTime="2026-04-24 14:24:40.503203425 +0000 UTC m=+71.093356147" Apr 24 14:24:40.833346 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:40.833307 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-78bb745fb6-x54tc" Apr 24 14:24:40.836634 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:40.836593 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-78bb745fb6-x54tc" Apr 24 14:24:41.491197 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:41.491158 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-78bb745fb6-x54tc" Apr 24 14:24:41.492493 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:41.492470 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-78bb745fb6-x54tc" Apr 24 14:24:42.495166 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:42.495115 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-cwc6x" event={"ID":"114b4da4-48aa-45ef-9304-25ef4821570d","Type":"ContainerStarted","Data":"f5431c56659359643ed27df7fb67b3fc8f5f0ae3ec3a694d4376db56a43d3c45"} Apr 24 14:24:42.495166 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:42.495167 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-cwc6x" event={"ID":"114b4da4-48aa-45ef-9304-25ef4821570d","Type":"ContainerStarted","Data":"1af500d06c9db27647ed71dc2a13adb4652eb2fb4e9c35718a029225e8b17435"} Apr 24 14:24:42.496560 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:42.496534 2567 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-jvth8" event={"ID":"0a1850b6-1bc3-4ff6-a567-7c2e8813a59d","Type":"ContainerStarted","Data":"cf70392f64160efa55efc8ff187634f3674e8d0e4c7c497dbc57c6f5a25d9082"} Apr 24 14:24:42.497781 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:42.497757 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8gtlg" event={"ID":"46136cc4-31fa-4d7b-a223-af0007d2fd5a","Type":"ContainerStarted","Data":"b317af4e10537e9e461d6c7d07d5946c251255acf585f247fbfc2bfe768b24dd"} Apr 24 14:24:42.508938 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:42.508885 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-cwc6x" podStartSLOduration=1.8844170660000001 podStartE2EDuration="3.508867694s" podCreationTimestamp="2026-04-24 14:24:39 +0000 UTC" firstStartedPulling="2026-04-24 14:24:40.202009992 +0000 UTC m=+70.792162692" lastFinishedPulling="2026-04-24 14:24:41.826460614 +0000 UTC m=+72.416613320" observedRunningTime="2026-04-24 14:24:42.508520311 +0000 UTC m=+73.098673033" watchObservedRunningTime="2026-04-24 14:24:42.508867694 +0000 UTC m=+73.099020420" Apr 24 14:24:42.522984 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:42.522934 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8gtlg" podStartSLOduration=1.677335354 podStartE2EDuration="3.52291923s" podCreationTimestamp="2026-04-24 14:24:39 +0000 UTC" firstStartedPulling="2026-04-24 14:24:39.978046076 +0000 UTC m=+70.568198790" lastFinishedPulling="2026-04-24 14:24:41.823629948 +0000 UTC m=+72.413782666" observedRunningTime="2026-04-24 14:24:42.521269368 +0000 UTC m=+73.111422091" watchObservedRunningTime="2026-04-24 14:24:42.52291923 +0000 UTC m=+73.113071951" Apr 24 14:24:42.540554 ip-10-0-128-36 
kubenswrapper[2567]: I0424 14:24:42.540499 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-jvth8" podStartSLOduration=1.55517341 podStartE2EDuration="3.540482052s" podCreationTimestamp="2026-04-24 14:24:39 +0000 UTC" firstStartedPulling="2026-04-24 14:24:39.889826417 +0000 UTC m=+70.479979119" lastFinishedPulling="2026-04-24 14:24:41.875135045 +0000 UTC m=+72.465287761" observedRunningTime="2026-04-24 14:24:42.539524418 +0000 UTC m=+73.129677139" watchObservedRunningTime="2026-04-24 14:24:42.540482052 +0000 UTC m=+73.130634774"
Apr 24 14:24:42.708483 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:42.708447 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-7979n"
Apr 24 14:24:42.708483 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:42.708483 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-7979n"
Apr 24 14:24:42.708857 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:42.708843 2567 scope.go:117] "RemoveContainer" containerID="51b4bf202790d494b22691ed8303bd45ceee6af1077513b5af5336f20c53d163"
Apr 24 14:24:42.709022 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:42.709007 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-7979n_openshift-console-operator(fee20684-697d-4937-9bf8-9549ab5442bf)\"" pod="openshift-console-operator/console-operator-9d4b6777b-7979n" podUID="fee20684-697d-4937-9bf8-9549ab5442bf"
Apr 24 14:24:43.501399 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:43.501368 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8gtlg"
Apr 24 14:24:43.506039 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:43.506008 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8gtlg"
Apr 24 14:24:43.700155 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:43.700131 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-qcxs6_743626cc-db6b-4ae5-a8dc-cceaa1cb8be0/dns/0.log"
Apr 24 14:24:43.885026 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:43.884999 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-qcxs6_743626cc-db6b-4ae5-a8dc-cceaa1cb8be0/kube-rbac-proxy/0.log"
Apr 24 14:24:44.475504 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:44.475473 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-vc8xg"]
Apr 24 14:24:44.490031 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:44.490007 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-p222m_fbd73a0c-457f-455c-b2ca-ef248d74efc8/dns-node-resolver/0.log"
Apr 24 14:24:44.501018 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:44.500993 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-vc8xg"]
Apr 24 14:24:44.501189 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:44.501120 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-vc8xg"
Apr 24 14:24:44.504199 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:44.504173 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 24 14:24:44.504605 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:44.504178 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 24 14:24:44.504605 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:44.504178 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 24 14:24:44.504605 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:44.504245 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-q7w5v\""
Apr 24 14:24:44.658986 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:44.658949 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/23201e36-718e-44a5-a05a-1ff9a57ff20b-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-vc8xg\" (UID: \"23201e36-718e-44a5-a05a-1ff9a57ff20b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vc8xg"
Apr 24 14:24:44.658986 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:44.658994 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/23201e36-718e-44a5-a05a-1ff9a57ff20b-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-vc8xg\" (UID: \"23201e36-718e-44a5-a05a-1ff9a57ff20b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vc8xg"
Apr 24 14:24:44.659312 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:44.659128 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvzwn\" (UniqueName: \"kubernetes.io/projected/23201e36-718e-44a5-a05a-1ff9a57ff20b-kube-api-access-nvzwn\") pod \"prometheus-operator-5676c8c784-vc8xg\" (UID: \"23201e36-718e-44a5-a05a-1ff9a57ff20b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vc8xg"
Apr 24 14:24:44.659312 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:44.659214 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/23201e36-718e-44a5-a05a-1ff9a57ff20b-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-vc8xg\" (UID: \"23201e36-718e-44a5-a05a-1ff9a57ff20b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vc8xg"
Apr 24 14:24:44.692257 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:44.692215 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-55d6cb6cf9-tqwbb_cbc7c460-1a4a-47be-8d93-efd98ee46239/registry/0.log"
Apr 24 14:24:44.760425 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:44.760349 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/23201e36-718e-44a5-a05a-1ff9a57ff20b-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-vc8xg\" (UID: \"23201e36-718e-44a5-a05a-1ff9a57ff20b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vc8xg"
Apr 24 14:24:44.760425 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:44.760396 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/23201e36-718e-44a5-a05a-1ff9a57ff20b-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-vc8xg\" (UID: \"23201e36-718e-44a5-a05a-1ff9a57ff20b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vc8xg"
Apr 24 14:24:44.760425 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:44.760426 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nvzwn\" (UniqueName: \"kubernetes.io/projected/23201e36-718e-44a5-a05a-1ff9a57ff20b-kube-api-access-nvzwn\") pod \"prometheus-operator-5676c8c784-vc8xg\" (UID: \"23201e36-718e-44a5-a05a-1ff9a57ff20b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vc8xg"
Apr 24 14:24:44.760674 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:44.760465 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/23201e36-718e-44a5-a05a-1ff9a57ff20b-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-vc8xg\" (UID: \"23201e36-718e-44a5-a05a-1ff9a57ff20b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vc8xg"
Apr 24 14:24:44.761061 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:44.761041 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/23201e36-718e-44a5-a05a-1ff9a57ff20b-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-vc8xg\" (UID: \"23201e36-718e-44a5-a05a-1ff9a57ff20b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vc8xg"
Apr 24 14:24:44.762989 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:44.762961 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/23201e36-718e-44a5-a05a-1ff9a57ff20b-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-vc8xg\" (UID: \"23201e36-718e-44a5-a05a-1ff9a57ff20b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vc8xg"
Apr 24 14:24:44.763079 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:44.763018 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/23201e36-718e-44a5-a05a-1ff9a57ff20b-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-vc8xg\" (UID: \"23201e36-718e-44a5-a05a-1ff9a57ff20b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vc8xg"
Apr 24 14:24:44.767932 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:44.767910 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvzwn\" (UniqueName: \"kubernetes.io/projected/23201e36-718e-44a5-a05a-1ff9a57ff20b-kube-api-access-nvzwn\") pod \"prometheus-operator-5676c8c784-vc8xg\" (UID: \"23201e36-718e-44a5-a05a-1ff9a57ff20b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vc8xg"
Apr 24 14:24:44.811081 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:44.811057 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-vc8xg"
Apr 24 14:24:44.941827 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:44.941768 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-vc8xg"]
Apr 24 14:24:44.944940 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:24:44.944893 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23201e36_718e_44a5_a05a_1ff9a57ff20b.slice/crio-8a48876c3435875105c782c7e133950453ebd34f45ba3546a74edb36d59e704e WatchSource:0}: Error finding container 8a48876c3435875105c782c7e133950453ebd34f45ba3546a74edb36d59e704e: Status 404 returned error can't find the container with id 8a48876c3435875105c782c7e133950453ebd34f45ba3546a74edb36d59e704e
Apr 24 14:24:45.487424 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:45.487395 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-m4w28_2a44ddf6-7291-4385-a9e5-9e6dd777407e/node-ca/0.log"
Apr 24 14:24:45.509408 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:45.509369 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-vc8xg" event={"ID":"23201e36-718e-44a5-a05a-1ff9a57ff20b","Type":"ContainerStarted","Data":"8a48876c3435875105c782c7e133950453ebd34f45ba3546a74edb36d59e704e"}
Apr 24 14:24:45.685912 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:45.685878 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-78bb745fb6-x54tc_e88e3b18-0568-4c3d-9672-99f01f7456b4/router/0.log"
Apr 24 14:24:45.885075 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:45.885039 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-95gsb_d834bc9a-bc43-42cc-82ed-5b3a77d4da5d/serve-healthcheck-canary/0.log"
Apr 24 14:24:46.085799 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:46.085752 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-cwc6x_114b4da4-48aa-45ef-9304-25ef4821570d/migrator/0.log"
Apr 24 14:24:46.284859 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:46.284777 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-cwc6x_114b4da4-48aa-45ef-9304-25ef4821570d/graceful-termination/0.log"
Apr 24 14:24:46.486982 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:46.486957 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-28tw4_efd840e5-21ca-4b28-993c-7e8bd0b9a822/kube-storage-version-migrator-operator/0.log"
Apr 24 14:24:47.204183 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:47.204143 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-fw95k"]
Apr 24 14:24:47.230882 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:47.230833 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-fw95k"]
Apr 24 14:24:47.231018 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:47.230963 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-fw95k"
Apr 24 14:24:47.233128 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:47.233101 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 24 14:24:47.233128 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:47.233107 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-gwkgg\""
Apr 24 14:24:47.233786 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:47.233751 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 24 14:24:47.384424 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:47.384383 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/48a81750-849c-4b96-b2cb-1463543ee44e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fw95k\" (UID: \"48a81750-849c-4b96-b2cb-1463543ee44e\") " pod="openshift-insights/insights-runtime-extractor-fw95k"
Apr 24 14:24:47.384424 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:47.384422 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztbz6\" (UniqueName: \"kubernetes.io/projected/48a81750-849c-4b96-b2cb-1463543ee44e-kube-api-access-ztbz6\") pod \"insights-runtime-extractor-fw95k\" (UID: \"48a81750-849c-4b96-b2cb-1463543ee44e\") " pod="openshift-insights/insights-runtime-extractor-fw95k"
Apr 24 14:24:47.384631 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:47.384446 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/48a81750-849c-4b96-b2cb-1463543ee44e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fw95k\" (UID: \"48a81750-849c-4b96-b2cb-1463543ee44e\") " pod="openshift-insights/insights-runtime-extractor-fw95k"
Apr 24 14:24:47.384631 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:47.384525 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/48a81750-849c-4b96-b2cb-1463543ee44e-crio-socket\") pod \"insights-runtime-extractor-fw95k\" (UID: \"48a81750-849c-4b96-b2cb-1463543ee44e\") " pod="openshift-insights/insights-runtime-extractor-fw95k"
Apr 24 14:24:47.384631 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:47.384562 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/48a81750-849c-4b96-b2cb-1463543ee44e-data-volume\") pod \"insights-runtime-extractor-fw95k\" (UID: \"48a81750-849c-4b96-b2cb-1463543ee44e\") " pod="openshift-insights/insights-runtime-extractor-fw95k"
Apr 24 14:24:47.485588 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:47.485498 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/48a81750-849c-4b96-b2cb-1463543ee44e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fw95k\" (UID: \"48a81750-849c-4b96-b2cb-1463543ee44e\") " pod="openshift-insights/insights-runtime-extractor-fw95k"
Apr 24 14:24:47.485588 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:47.485541 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ztbz6\" (UniqueName: \"kubernetes.io/projected/48a81750-849c-4b96-b2cb-1463543ee44e-kube-api-access-ztbz6\") pod \"insights-runtime-extractor-fw95k\" (UID: \"48a81750-849c-4b96-b2cb-1463543ee44e\") " pod="openshift-insights/insights-runtime-extractor-fw95k"
Apr 24 14:24:47.485588 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:47.485563 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/48a81750-849c-4b96-b2cb-1463543ee44e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fw95k\" (UID: \"48a81750-849c-4b96-b2cb-1463543ee44e\") " pod="openshift-insights/insights-runtime-extractor-fw95k"
Apr 24 14:24:47.485856 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:47.485739 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/48a81750-849c-4b96-b2cb-1463543ee44e-crio-socket\") pod \"insights-runtime-extractor-fw95k\" (UID: \"48a81750-849c-4b96-b2cb-1463543ee44e\") " pod="openshift-insights/insights-runtime-extractor-fw95k"
Apr 24 14:24:47.485856 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:47.485795 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/48a81750-849c-4b96-b2cb-1463543ee44e-data-volume\") pod \"insights-runtime-extractor-fw95k\" (UID: \"48a81750-849c-4b96-b2cb-1463543ee44e\") " pod="openshift-insights/insights-runtime-extractor-fw95k"
Apr 24 14:24:47.485960 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:47.485876 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/48a81750-849c-4b96-b2cb-1463543ee44e-crio-socket\") pod \"insights-runtime-extractor-fw95k\" (UID: \"48a81750-849c-4b96-b2cb-1463543ee44e\") " pod="openshift-insights/insights-runtime-extractor-fw95k"
Apr 24 14:24:47.486192 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:47.486169 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/48a81750-849c-4b96-b2cb-1463543ee44e-data-volume\") pod \"insights-runtime-extractor-fw95k\" (UID: \"48a81750-849c-4b96-b2cb-1463543ee44e\") " pod="openshift-insights/insights-runtime-extractor-fw95k"
Apr 24 14:24:47.486322 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:47.486307 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/48a81750-849c-4b96-b2cb-1463543ee44e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fw95k\" (UID: \"48a81750-849c-4b96-b2cb-1463543ee44e\") " pod="openshift-insights/insights-runtime-extractor-fw95k"
Apr 24 14:24:47.487980 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:47.487959 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/48a81750-849c-4b96-b2cb-1463543ee44e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fw95k\" (UID: \"48a81750-849c-4b96-b2cb-1463543ee44e\") " pod="openshift-insights/insights-runtime-extractor-fw95k"
Apr 24 14:24:47.492754 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:47.492733 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztbz6\" (UniqueName: \"kubernetes.io/projected/48a81750-849c-4b96-b2cb-1463543ee44e-kube-api-access-ztbz6\") pod \"insights-runtime-extractor-fw95k\" (UID: \"48a81750-849c-4b96-b2cb-1463543ee44e\") " pod="openshift-insights/insights-runtime-extractor-fw95k"
Apr 24 14:24:47.516891 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:47.516852 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-vc8xg" event={"ID":"23201e36-718e-44a5-a05a-1ff9a57ff20b","Type":"ContainerStarted","Data":"dc0f395150f435170399d386829d06836fcb0f9c3d8227093afc964cd7f614cc"}
Apr 24 14:24:47.516891 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:47.516894 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-vc8xg" event={"ID":"23201e36-718e-44a5-a05a-1ff9a57ff20b","Type":"ContainerStarted","Data":"a47bfb590944965cf7635dcddb152c55ce9ae9acd728f284cc5a0bdfed35a1d2"}
Apr 24 14:24:47.532327 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:47.532274 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-vc8xg" podStartSLOduration=1.888612015 podStartE2EDuration="3.53223439s" podCreationTimestamp="2026-04-24 14:24:44 +0000 UTC" firstStartedPulling="2026-04-24 14:24:44.94699495 +0000 UTC m=+75.537147651" lastFinishedPulling="2026-04-24 14:24:46.590617313 +0000 UTC m=+77.180770026" observedRunningTime="2026-04-24 14:24:47.531095173 +0000 UTC m=+78.121247894" watchObservedRunningTime="2026-04-24 14:24:47.53223439 +0000 UTC m=+78.122387112"
Apr 24 14:24:47.540496 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:47.540466 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-fw95k"
Apr 24 14:24:47.665739 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:47.665437 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-fw95k"]
Apr 24 14:24:47.668288 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:24:47.668261 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48a81750_849c_4b96_b2cb_1463543ee44e.slice/crio-5af1cb1e9cdc76da951472cfcaaf2ce86d9acb50486b8e9f7e5bbd351b80ca09 WatchSource:0}: Error finding container 5af1cb1e9cdc76da951472cfcaaf2ce86d9acb50486b8e9f7e5bbd351b80ca09: Status 404 returned error can't find the container with id 5af1cb1e9cdc76da951472cfcaaf2ce86d9acb50486b8e9f7e5bbd351b80ca09
Apr 24 14:24:48.371042 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:48.371010 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-57rkt"
Apr 24 14:24:48.521991 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:48.521952 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fw95k" event={"ID":"48a81750-849c-4b96-b2cb-1463543ee44e","Type":"ContainerStarted","Data":"8e690abbb53e3a7c88adc9e68f5792e6010d0e0608999c0a3b6a4509d6f30079"}
Apr 24 14:24:48.522153 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:48.522001 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fw95k" event={"ID":"48a81750-849c-4b96-b2cb-1463543ee44e","Type":"ContainerStarted","Data":"5af1cb1e9cdc76da951472cfcaaf2ce86d9acb50486b8e9f7e5bbd351b80ca09"}
Apr 24 14:24:49.490752 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:49.490720 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-qcxs6"
Apr 24 14:24:49.532936 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:49.532892 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fw95k" event={"ID":"48a81750-849c-4b96-b2cb-1463543ee44e","Type":"ContainerStarted","Data":"140d64ca134a17ebdb19ad258930a4c9ed325b376c14b954994e15ce257ad798"}
Apr 24 14:24:49.883029 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:49.882547 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-fx47g"]
Apr 24 14:24:49.899962 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:49.899932 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-fx47g"
Apr 24 14:24:49.902225 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:49.902204 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 24 14:24:49.902365 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:49.902233 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-72grf\""
Apr 24 14:24:49.902365 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:49.902267 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 24 14:24:49.902624 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:49.902609 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 24 14:24:50.008535 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:50.008472 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/379f1218-5ffe-4cc4-a3fe-be26d209b35b-metrics-client-ca\") pod \"node-exporter-fx47g\" (UID: \"379f1218-5ffe-4cc4-a3fe-be26d209b35b\") " pod="openshift-monitoring/node-exporter-fx47g"
Apr 24 14:24:50.008535 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:50.008522 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/379f1218-5ffe-4cc4-a3fe-be26d209b35b-node-exporter-tls\") pod \"node-exporter-fx47g\" (UID: \"379f1218-5ffe-4cc4-a3fe-be26d209b35b\") " pod="openshift-monitoring/node-exporter-fx47g"
Apr 24 14:24:50.009087 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:50.008552 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/379f1218-5ffe-4cc4-a3fe-be26d209b35b-node-exporter-wtmp\") pod \"node-exporter-fx47g\" (UID: \"379f1218-5ffe-4cc4-a3fe-be26d209b35b\") " pod="openshift-monitoring/node-exporter-fx47g"
Apr 24 14:24:50.009087 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:50.008579 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/379f1218-5ffe-4cc4-a3fe-be26d209b35b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-fx47g\" (UID: \"379f1218-5ffe-4cc4-a3fe-be26d209b35b\") " pod="openshift-monitoring/node-exporter-fx47g"
Apr 24 14:24:50.009087 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:50.008634 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hb6s\" (UniqueName: \"kubernetes.io/projected/379f1218-5ffe-4cc4-a3fe-be26d209b35b-kube-api-access-4hb6s\") pod \"node-exporter-fx47g\" (UID: \"379f1218-5ffe-4cc4-a3fe-be26d209b35b\") " pod="openshift-monitoring/node-exporter-fx47g"
Apr 24 14:24:50.009087 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:50.008718 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/379f1218-5ffe-4cc4-a3fe-be26d209b35b-sys\") pod \"node-exporter-fx47g\" (UID: \"379f1218-5ffe-4cc4-a3fe-be26d209b35b\") " pod="openshift-monitoring/node-exporter-fx47g"
Apr 24 14:24:50.009087 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:50.008741 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/379f1218-5ffe-4cc4-a3fe-be26d209b35b-root\") pod \"node-exporter-fx47g\" (UID: \"379f1218-5ffe-4cc4-a3fe-be26d209b35b\") " pod="openshift-monitoring/node-exporter-fx47g"
Apr 24 14:24:50.009087 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:50.008834 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/379f1218-5ffe-4cc4-a3fe-be26d209b35b-node-exporter-accelerators-collector-config\") pod \"node-exporter-fx47g\" (UID: \"379f1218-5ffe-4cc4-a3fe-be26d209b35b\") " pod="openshift-monitoring/node-exporter-fx47g"
Apr 24 14:24:50.009087 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:50.008895 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/379f1218-5ffe-4cc4-a3fe-be26d209b35b-node-exporter-textfile\") pod \"node-exporter-fx47g\" (UID: \"379f1218-5ffe-4cc4-a3fe-be26d209b35b\") " pod="openshift-monitoring/node-exporter-fx47g"
Apr 24 14:24:50.109776 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:50.109739 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/379f1218-5ffe-4cc4-a3fe-be26d209b35b-node-exporter-accelerators-collector-config\") pod \"node-exporter-fx47g\" (UID: \"379f1218-5ffe-4cc4-a3fe-be26d209b35b\") " pod="openshift-monitoring/node-exporter-fx47g"
Apr 24 14:24:50.109965 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:50.109839 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/379f1218-5ffe-4cc4-a3fe-be26d209b35b-node-exporter-textfile\") pod \"node-exporter-fx47g\" (UID: \"379f1218-5ffe-4cc4-a3fe-be26d209b35b\") " pod="openshift-monitoring/node-exporter-fx47g"
Apr 24 14:24:50.109965 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:50.109892 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/379f1218-5ffe-4cc4-a3fe-be26d209b35b-metrics-client-ca\") pod \"node-exporter-fx47g\" (UID: \"379f1218-5ffe-4cc4-a3fe-be26d209b35b\") " pod="openshift-monitoring/node-exporter-fx47g"
Apr 24 14:24:50.109965 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:50.109927 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/379f1218-5ffe-4cc4-a3fe-be26d209b35b-node-exporter-tls\") pod \"node-exporter-fx47g\" (UID: \"379f1218-5ffe-4cc4-a3fe-be26d209b35b\") " pod="openshift-monitoring/node-exporter-fx47g"
Apr 24 14:24:50.109965 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:50.109956 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/379f1218-5ffe-4cc4-a3fe-be26d209b35b-node-exporter-wtmp\") pod \"node-exporter-fx47g\" (UID: \"379f1218-5ffe-4cc4-a3fe-be26d209b35b\") " pod="openshift-monitoring/node-exporter-fx47g"
Apr 24 14:24:50.110166 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:50.109982 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/379f1218-5ffe-4cc4-a3fe-be26d209b35b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-fx47g\" (UID: \"379f1218-5ffe-4cc4-a3fe-be26d209b35b\") " pod="openshift-monitoring/node-exporter-fx47g"
Apr 24 14:24:50.110166 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:50.110005 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4hb6s\" (UniqueName: \"kubernetes.io/projected/379f1218-5ffe-4cc4-a3fe-be26d209b35b-kube-api-access-4hb6s\") pod \"node-exporter-fx47g\" (UID: \"379f1218-5ffe-4cc4-a3fe-be26d209b35b\") " pod="openshift-monitoring/node-exporter-fx47g"
Apr 24 14:24:50.110166 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:50.110036 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/379f1218-5ffe-4cc4-a3fe-be26d209b35b-sys\") pod \"node-exporter-fx47g\" (UID: \"379f1218-5ffe-4cc4-a3fe-be26d209b35b\") " pod="openshift-monitoring/node-exporter-fx47g"
Apr 24 14:24:50.110166 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:50.110060 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/379f1218-5ffe-4cc4-a3fe-be26d209b35b-root\") pod \"node-exporter-fx47g\" (UID: \"379f1218-5ffe-4cc4-a3fe-be26d209b35b\") " pod="openshift-monitoring/node-exporter-fx47g"
Apr 24 14:24:50.110166 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:50.110159 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/379f1218-5ffe-4cc4-a3fe-be26d209b35b-root\") pod \"node-exporter-fx47g\" (UID: \"379f1218-5ffe-4cc4-a3fe-be26d209b35b\") " pod="openshift-monitoring/node-exporter-fx47g"
Apr 24 14:24:50.110448 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:50.110205 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/379f1218-5ffe-4cc4-a3fe-be26d209b35b-node-exporter-textfile\") pod \"node-exporter-fx47g\" (UID: \"379f1218-5ffe-4cc4-a3fe-be26d209b35b\") " pod="openshift-monitoring/node-exporter-fx47g"
Apr 24 14:24:50.110448 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:50.110354 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/379f1218-5ffe-4cc4-a3fe-be26d209b35b-node-exporter-accelerators-collector-config\") pod \"node-exporter-fx47g\" (UID: \"379f1218-5ffe-4cc4-a3fe-be26d209b35b\") " pod="openshift-monitoring/node-exporter-fx47g"
Apr 24 14:24:50.110448 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:50.110410 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/379f1218-5ffe-4cc4-a3fe-be26d209b35b-sys\") pod \"node-exporter-fx47g\" (UID: \"379f1218-5ffe-4cc4-a3fe-be26d209b35b\") " pod="openshift-monitoring/node-exporter-fx47g"
Apr 24 14:24:50.110603 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:50.110540 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/379f1218-5ffe-4cc4-a3fe-be26d209b35b-node-exporter-wtmp\") pod \"node-exporter-fx47g\" (UID: \"379f1218-5ffe-4cc4-a3fe-be26d209b35b\") " pod="openshift-monitoring/node-exporter-fx47g"
Apr 24 14:24:50.110657 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:50.110621 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/379f1218-5ffe-4cc4-a3fe-be26d209b35b-metrics-client-ca\") pod \"node-exporter-fx47g\" (UID: \"379f1218-5ffe-4cc4-a3fe-be26d209b35b\") " pod="openshift-monitoring/node-exporter-fx47g"
Apr 24 14:24:50.113146 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:50.113123 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/379f1218-5ffe-4cc4-a3fe-be26d209b35b-node-exporter-tls\") pod \"node-exporter-fx47g\" (UID: \"379f1218-5ffe-4cc4-a3fe-be26d209b35b\") " pod="openshift-monitoring/node-exporter-fx47g"
Apr 24 14:24:50.113426 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:50.113402 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/379f1218-5ffe-4cc4-a3fe-be26d209b35b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-fx47g\" (UID: \"379f1218-5ffe-4cc4-a3fe-be26d209b35b\") " pod="openshift-monitoring/node-exporter-fx47g"
Apr 24 14:24:50.119004 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:50.118982 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hb6s\" (UniqueName: \"kubernetes.io/projected/379f1218-5ffe-4cc4-a3fe-be26d209b35b-kube-api-access-4hb6s\") pod \"node-exporter-fx47g\" (UID: \"379f1218-5ffe-4cc4-a3fe-be26d209b35b\") " pod="openshift-monitoring/node-exporter-fx47g"
Apr 24 14:24:50.210713 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:50.210633 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-fx47g"
Apr 24 14:24:50.222304 ip-10-0-128-36 kubenswrapper[2567]: W0424 14:24:50.222243 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod379f1218_5ffe_4cc4_a3fe_be26d209b35b.slice/crio-d382b90d35ebaf7285467cea5664b590658e0bae1e0d515f733ceaa9061feb3c WatchSource:0}: Error finding container d382b90d35ebaf7285467cea5664b590658e0bae1e0d515f733ceaa9061feb3c: Status 404 returned error can't find the container with id d382b90d35ebaf7285467cea5664b590658e0bae1e0d515f733ceaa9061feb3c
Apr 24 14:24:50.537458 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:50.537379 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-fx47g" event={"ID":"379f1218-5ffe-4cc4-a3fe-be26d209b35b","Type":"ContainerStarted","Data":"d382b90d35ebaf7285467cea5664b590658e0bae1e0d515f733ceaa9061feb3c"}
Apr 24 14:24:52.546418 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:52.546329 2567 generic.go:358] "Generic (PLEG): container finished" podID="379f1218-5ffe-4cc4-a3fe-be26d209b35b" containerID="ff372c722e95428d9ac3347d6e41ea991fb3331e0c5fb0d69d4a602a238d27ae" exitCode=0
Apr 24 14:24:52.546865 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:52.546423 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-fx47g" event={"ID":"379f1218-5ffe-4cc4-a3fe-be26d209b35b","Type":"ContainerDied","Data":"ff372c722e95428d9ac3347d6e41ea991fb3331e0c5fb0d69d4a602a238d27ae"}
Apr 24 14:24:52.548536 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:52.548511 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fw95k" event={"ID":"48a81750-849c-4b96-b2cb-1463543ee44e","Type":"ContainerStarted","Data":"f29f844650e747c9e8ff2389e35a2a57287b7aa0d150909c825fc8ce87e07876"}
Apr 24 14:24:52.578273 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:52.578221
2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-fw95k" podStartSLOduration=1.696106586 podStartE2EDuration="5.578204054s" podCreationTimestamp="2026-04-24 14:24:47 +0000 UTC" firstStartedPulling="2026-04-24 14:24:47.772760965 +0000 UTC m=+78.362913666" lastFinishedPulling="2026-04-24 14:24:51.654858425 +0000 UTC m=+82.245011134" observedRunningTime="2026-04-24 14:24:52.577035722 +0000 UTC m=+83.167188443" watchObservedRunningTime="2026-04-24 14:24:52.578204054 +0000 UTC m=+83.168356776" Apr 24 14:24:53.553497 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:53.553457 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-fx47g" event={"ID":"379f1218-5ffe-4cc4-a3fe-be26d209b35b","Type":"ContainerStarted","Data":"fea17e2c4209ec72e1c6577a8cfa66361246522ceccca828dab23bc51b9490b6"} Apr 24 14:24:53.553497 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:53.553498 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-fx47g" event={"ID":"379f1218-5ffe-4cc4-a3fe-be26d209b35b","Type":"ContainerStarted","Data":"335df536d6465885866d8a279dc75ddfa84fe7d086e4e718a505a04f821449cb"} Apr 24 14:24:53.573356 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:53.573304 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-fx47g" podStartSLOduration=2.805230568 podStartE2EDuration="4.573289665s" podCreationTimestamp="2026-04-24 14:24:49 +0000 UTC" firstStartedPulling="2026-04-24 14:24:50.224742621 +0000 UTC m=+80.814895326" lastFinishedPulling="2026-04-24 14:24:51.99280172 +0000 UTC m=+82.582954423" observedRunningTime="2026-04-24 14:24:53.571688134 +0000 UTC m=+84.161840855" watchObservedRunningTime="2026-04-24 14:24:53.573289665 +0000 UTC m=+84.163442432" Apr 24 14:24:54.018867 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:54.018787 2567 scope.go:117] "RemoveContainer" 
containerID="51b4bf202790d494b22691ed8303bd45ceee6af1077513b5af5336f20c53d163" Apr 24 14:24:54.019125 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:24:54.019099 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-7979n_openshift-console-operator(fee20684-697d-4937-9bf8-9549ab5442bf)\"" pod="openshift-console-operator/console-operator-9d4b6777b-7979n" podUID="fee20684-697d-4937-9bf8-9549ab5442bf" Apr 24 14:24:54.470105 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:54.470067 2567 patch_prober.go:28] interesting pod/image-registry-55d6cb6cf9-tqwbb container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 24 14:24:54.470265 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:54.470131 2567 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-55d6cb6cf9-tqwbb" podUID="cbc7c460-1a4a-47be-8d93-efd98ee46239" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:24:56.458844 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:24:56.458800 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-55d6cb6cf9-tqwbb" Apr 24 14:25:07.418950 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:07.418918 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-55d6cb6cf9-tqwbb"] Apr 24 14:25:08.019562 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:08.019529 2567 scope.go:117] "RemoveContainer" containerID="51b4bf202790d494b22691ed8303bd45ceee6af1077513b5af5336f20c53d163" Apr 24 14:25:08.601221 ip-10-0-128-36 
kubenswrapper[2567]: I0424 14:25:08.601194 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7979n_fee20684-697d-4937-9bf8-9549ab5442bf/console-operator/2.log" Apr 24 14:25:08.601572 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:08.601243 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-7979n" event={"ID":"fee20684-697d-4937-9bf8-9549ab5442bf","Type":"ContainerStarted","Data":"fedcdce833e10c72a506203bab50648c2371c950f4144facfa3fe35ffbdbdaf4"} Apr 24 14:25:08.601572 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:08.601498 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-7979n" Apr 24 14:25:08.606156 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:08.606135 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-7979n" Apr 24 14:25:08.617528 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:08.617488 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-7979n" podStartSLOduration=82.214788011 podStartE2EDuration="1m35.617475166s" podCreationTimestamp="2026-04-24 14:23:33 +0000 UTC" firstStartedPulling="2026-04-24 14:24:03.17466341 +0000 UTC m=+33.764816116" lastFinishedPulling="2026-04-24 14:24:16.57735057 +0000 UTC m=+47.167503271" observedRunningTime="2026-04-24 14:25:08.61615107 +0000 UTC m=+99.206303791" watchObservedRunningTime="2026-04-24 14:25:08.617475166 +0000 UTC m=+99.207627888" Apr 24 14:25:32.437736 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:32.437669 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-55d6cb6cf9-tqwbb" podUID="cbc7c460-1a4a-47be-8d93-efd98ee46239" containerName="registry" 
containerID="cri-o://e1ef6b5cf29a66cf9ccdb5ac8e912e73dd9acb1c611d2f42a51c0dafd9c2702e" gracePeriod=30 Apr 24 14:25:32.657625 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:32.657604 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-55d6cb6cf9-tqwbb" Apr 24 14:25:32.671188 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:32.671162 2567 generic.go:358] "Generic (PLEG): container finished" podID="cbc7c460-1a4a-47be-8d93-efd98ee46239" containerID="e1ef6b5cf29a66cf9ccdb5ac8e912e73dd9acb1c611d2f42a51c0dafd9c2702e" exitCode=0 Apr 24 14:25:32.671315 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:32.671215 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-55d6cb6cf9-tqwbb" Apr 24 14:25:32.671315 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:32.671243 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-55d6cb6cf9-tqwbb" event={"ID":"cbc7c460-1a4a-47be-8d93-efd98ee46239","Type":"ContainerDied","Data":"e1ef6b5cf29a66cf9ccdb5ac8e912e73dd9acb1c611d2f42a51c0dafd9c2702e"} Apr 24 14:25:32.671315 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:32.671287 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-55d6cb6cf9-tqwbb" event={"ID":"cbc7c460-1a4a-47be-8d93-efd98ee46239","Type":"ContainerDied","Data":"24e989e8b2085a04478fb15aba8aabbb25491206a82c0f87203926dddf34b6ef"} Apr 24 14:25:32.671315 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:32.671306 2567 scope.go:117] "RemoveContainer" containerID="e1ef6b5cf29a66cf9ccdb5ac8e912e73dd9acb1c611d2f42a51c0dafd9c2702e" Apr 24 14:25:32.679385 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:32.679234 2567 scope.go:117] "RemoveContainer" containerID="e1ef6b5cf29a66cf9ccdb5ac8e912e73dd9acb1c611d2f42a51c0dafd9c2702e" Apr 24 14:25:32.679514 ip-10-0-128-36 kubenswrapper[2567]: E0424 14:25:32.679493 
2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1ef6b5cf29a66cf9ccdb5ac8e912e73dd9acb1c611d2f42a51c0dafd9c2702e\": container with ID starting with e1ef6b5cf29a66cf9ccdb5ac8e912e73dd9acb1c611d2f42a51c0dafd9c2702e not found: ID does not exist" containerID="e1ef6b5cf29a66cf9ccdb5ac8e912e73dd9acb1c611d2f42a51c0dafd9c2702e" Apr 24 14:25:32.679573 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:32.679527 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1ef6b5cf29a66cf9ccdb5ac8e912e73dd9acb1c611d2f42a51c0dafd9c2702e"} err="failed to get container status \"e1ef6b5cf29a66cf9ccdb5ac8e912e73dd9acb1c611d2f42a51c0dafd9c2702e\": rpc error: code = NotFound desc = could not find container \"e1ef6b5cf29a66cf9ccdb5ac8e912e73dd9acb1c611d2f42a51c0dafd9c2702e\": container with ID starting with e1ef6b5cf29a66cf9ccdb5ac8e912e73dd9acb1c611d2f42a51c0dafd9c2702e not found: ID does not exist" Apr 24 14:25:32.768418 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:32.768346 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cbc7c460-1a4a-47be-8d93-efd98ee46239-trusted-ca\") pod \"cbc7c460-1a4a-47be-8d93-efd98ee46239\" (UID: \"cbc7c460-1a4a-47be-8d93-efd98ee46239\") " Apr 24 14:25:32.768418 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:32.768385 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cbc7c460-1a4a-47be-8d93-efd98ee46239-registry-tls\") pod \"cbc7c460-1a4a-47be-8d93-efd98ee46239\" (UID: \"cbc7c460-1a4a-47be-8d93-efd98ee46239\") " Apr 24 14:25:32.768418 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:32.768403 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xv69l\" (UniqueName: 
\"kubernetes.io/projected/cbc7c460-1a4a-47be-8d93-efd98ee46239-kube-api-access-xv69l\") pod \"cbc7c460-1a4a-47be-8d93-efd98ee46239\" (UID: \"cbc7c460-1a4a-47be-8d93-efd98ee46239\") " Apr 24 14:25:32.768621 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:32.768454 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cbc7c460-1a4a-47be-8d93-efd98ee46239-ca-trust-extracted\") pod \"cbc7c460-1a4a-47be-8d93-efd98ee46239\" (UID: \"cbc7c460-1a4a-47be-8d93-efd98ee46239\") " Apr 24 14:25:32.768621 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:32.768489 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cbc7c460-1a4a-47be-8d93-efd98ee46239-image-registry-private-configuration\") pod \"cbc7c460-1a4a-47be-8d93-efd98ee46239\" (UID: \"cbc7c460-1a4a-47be-8d93-efd98ee46239\") " Apr 24 14:25:32.768621 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:32.768535 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cbc7c460-1a4a-47be-8d93-efd98ee46239-bound-sa-token\") pod \"cbc7c460-1a4a-47be-8d93-efd98ee46239\" (UID: \"cbc7c460-1a4a-47be-8d93-efd98ee46239\") " Apr 24 14:25:32.768621 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:32.768556 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cbc7c460-1a4a-47be-8d93-efd98ee46239-installation-pull-secrets\") pod \"cbc7c460-1a4a-47be-8d93-efd98ee46239\" (UID: \"cbc7c460-1a4a-47be-8d93-efd98ee46239\") " Apr 24 14:25:32.768621 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:32.768580 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/cbc7c460-1a4a-47be-8d93-efd98ee46239-registry-certificates\") pod \"cbc7c460-1a4a-47be-8d93-efd98ee46239\" (UID: \"cbc7c460-1a4a-47be-8d93-efd98ee46239\") " Apr 24 14:25:32.768893 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:32.768792 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbc7c460-1a4a-47be-8d93-efd98ee46239-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "cbc7c460-1a4a-47be-8d93-efd98ee46239" (UID: "cbc7c460-1a4a-47be-8d93-efd98ee46239"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 14:25:32.769215 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:32.769184 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbc7c460-1a4a-47be-8d93-efd98ee46239-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "cbc7c460-1a4a-47be-8d93-efd98ee46239" (UID: "cbc7c460-1a4a-47be-8d93-efd98ee46239"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 14:25:32.771126 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:32.771088 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbc7c460-1a4a-47be-8d93-efd98ee46239-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "cbc7c460-1a4a-47be-8d93-efd98ee46239" (UID: "cbc7c460-1a4a-47be-8d93-efd98ee46239"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:25:32.771230 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:32.771148 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbc7c460-1a4a-47be-8d93-efd98ee46239-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "cbc7c460-1a4a-47be-8d93-efd98ee46239" (UID: "cbc7c460-1a4a-47be-8d93-efd98ee46239"). 
InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 14:25:32.771230 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:32.771180 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbc7c460-1a4a-47be-8d93-efd98ee46239-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "cbc7c460-1a4a-47be-8d93-efd98ee46239" (UID: "cbc7c460-1a4a-47be-8d93-efd98ee46239"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 14:25:32.771230 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:32.771186 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbc7c460-1a4a-47be-8d93-efd98ee46239-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "cbc7c460-1a4a-47be-8d93-efd98ee46239" (UID: "cbc7c460-1a4a-47be-8d93-efd98ee46239"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:25:32.771385 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:32.771280 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbc7c460-1a4a-47be-8d93-efd98ee46239-kube-api-access-xv69l" (OuterVolumeSpecName: "kube-api-access-xv69l") pod "cbc7c460-1a4a-47be-8d93-efd98ee46239" (UID: "cbc7c460-1a4a-47be-8d93-efd98ee46239"). InnerVolumeSpecName "kube-api-access-xv69l". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 14:25:32.776926 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:32.776899 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbc7c460-1a4a-47be-8d93-efd98ee46239-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "cbc7c460-1a4a-47be-8d93-efd98ee46239" (UID: "cbc7c460-1a4a-47be-8d93-efd98ee46239"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:25:32.822558 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:32.822524 2567 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79d47f7669-4ff6w" podUID="84bc3c16-3cf7-443f-887d-67eaa7bb7631" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 24 14:25:32.869852 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:32.869801 2567 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cbc7c460-1a4a-47be-8d93-efd98ee46239-image-registry-private-configuration\") on node \"ip-10-0-128-36.ec2.internal\" DevicePath \"\"" Apr 24 14:25:32.869852 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:32.869852 2567 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cbc7c460-1a4a-47be-8d93-efd98ee46239-bound-sa-token\") on node \"ip-10-0-128-36.ec2.internal\" DevicePath \"\"" Apr 24 14:25:32.870006 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:32.869868 2567 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cbc7c460-1a4a-47be-8d93-efd98ee46239-installation-pull-secrets\") on node \"ip-10-0-128-36.ec2.internal\" DevicePath \"\"" Apr 24 14:25:32.870006 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:32.869881 2567 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cbc7c460-1a4a-47be-8d93-efd98ee46239-registry-certificates\") on node \"ip-10-0-128-36.ec2.internal\" DevicePath \"\"" Apr 24 14:25:32.870006 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:32.869893 2567 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cbc7c460-1a4a-47be-8d93-efd98ee46239-trusted-ca\") on node 
\"ip-10-0-128-36.ec2.internal\" DevicePath \"\"" Apr 24 14:25:32.870006 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:32.869904 2567 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cbc7c460-1a4a-47be-8d93-efd98ee46239-registry-tls\") on node \"ip-10-0-128-36.ec2.internal\" DevicePath \"\"" Apr 24 14:25:32.870006 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:32.869915 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xv69l\" (UniqueName: \"kubernetes.io/projected/cbc7c460-1a4a-47be-8d93-efd98ee46239-kube-api-access-xv69l\") on node \"ip-10-0-128-36.ec2.internal\" DevicePath \"\"" Apr 24 14:25:32.870006 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:32.869927 2567 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cbc7c460-1a4a-47be-8d93-efd98ee46239-ca-trust-extracted\") on node \"ip-10-0-128-36.ec2.internal\" DevicePath \"\"" Apr 24 14:25:32.994301 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:32.994271 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-55d6cb6cf9-tqwbb"] Apr 24 14:25:32.997383 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:32.997364 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-55d6cb6cf9-tqwbb"] Apr 24 14:25:33.676428 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:33.676389 2567 generic.go:358] "Generic (PLEG): container finished" podID="efd840e5-21ca-4b28-993c-7e8bd0b9a822" containerID="5d6707f2864c35caacd186670de659955c30698fdf180c3acda28223c01da4cd" exitCode=0 Apr 24 14:25:33.676861 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:33.676462 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-28tw4" 
event={"ID":"efd840e5-21ca-4b28-993c-7e8bd0b9a822","Type":"ContainerDied","Data":"5d6707f2864c35caacd186670de659955c30698fdf180c3acda28223c01da4cd"} Apr 24 14:25:33.676861 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:33.676766 2567 scope.go:117] "RemoveContainer" containerID="5d6707f2864c35caacd186670de659955c30698fdf180c3acda28223c01da4cd" Apr 24 14:25:34.022984 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:34.022898 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbc7c460-1a4a-47be-8d93-efd98ee46239" path="/var/lib/kubelet/pods/cbc7c460-1a4a-47be-8d93-efd98ee46239/volumes" Apr 24 14:25:34.681404 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:34.681370 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-28tw4" event={"ID":"efd840e5-21ca-4b28-993c-7e8bd0b9a822","Type":"ContainerStarted","Data":"f43c9fc7ff71f068531a67c965156b18a1c93c2f86ee940386dc777b584b6051"} Apr 24 14:25:37.693777 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:37.693736 2567 generic.go:358] "Generic (PLEG): container finished" podID="447bb5cc-c01c-4312-b625-551aa4e765b3" containerID="2f2e98d893b95f0d1935361cc07236304c5ecfee2141eb550fd4b5ecf9fa861a" exitCode=0 Apr 24 14:25:37.694175 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:37.693829 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qtdqf" event={"ID":"447bb5cc-c01c-4312-b625-551aa4e765b3","Type":"ContainerDied","Data":"2f2e98d893b95f0d1935361cc07236304c5ecfee2141eb550fd4b5ecf9fa861a"} Apr 24 14:25:37.694175 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:37.694134 2567 scope.go:117] "RemoveContainer" containerID="2f2e98d893b95f0d1935361cc07236304c5ecfee2141eb550fd4b5ecf9fa861a" Apr 24 14:25:38.698508 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:38.698471 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qtdqf" event={"ID":"447bb5cc-c01c-4312-b625-551aa4e765b3","Type":"ContainerStarted","Data":"9b34e708bcec97f59bbfbe72e0c2b0035c4e96f5757bc5f61c913b5a90a68b40"} Apr 24 14:25:42.822421 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:42.822376 2567 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79d47f7669-4ff6w" podUID="84bc3c16-3cf7-443f-887d-67eaa7bb7631" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 24 14:25:52.822799 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:52.822741 2567 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79d47f7669-4ff6w" podUID="84bc3c16-3cf7-443f-887d-67eaa7bb7631" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 24 14:25:52.823193 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:52.822868 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79d47f7669-4ff6w" Apr 24 14:25:52.823352 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:52.823335 2567 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"ca8301556e9be7816cf2ab3b558fdcd3437abe719de9250745085bf529516ae8"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79d47f7669-4ff6w" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 24 14:25:52.823391 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:52.823370 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79d47f7669-4ff6w" podUID="84bc3c16-3cf7-443f-887d-67eaa7bb7631" containerName="service-proxy" 
containerID="cri-o://ca8301556e9be7816cf2ab3b558fdcd3437abe719de9250745085bf529516ae8" gracePeriod=30 Apr 24 14:25:53.746363 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:53.746325 2567 generic.go:358] "Generic (PLEG): container finished" podID="84bc3c16-3cf7-443f-887d-67eaa7bb7631" containerID="ca8301556e9be7816cf2ab3b558fdcd3437abe719de9250745085bf529516ae8" exitCode=2 Apr 24 14:25:53.746527 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:53.746392 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79d47f7669-4ff6w" event={"ID":"84bc3c16-3cf7-443f-887d-67eaa7bb7631","Type":"ContainerDied","Data":"ca8301556e9be7816cf2ab3b558fdcd3437abe719de9250745085bf529516ae8"} Apr 24 14:25:53.746527 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:25:53.746428 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79d47f7669-4ff6w" event={"ID":"84bc3c16-3cf7-443f-887d-67eaa7bb7631","Type":"ContainerStarted","Data":"235a7d8a1186d89d01c5e797792d8987beb610c8f95da1463c478209c817ca40"} Apr 24 14:28:29.918123 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:28:29.918090 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7979n_fee20684-697d-4937-9bf8-9549ab5442bf/console-operator/2.log" Apr 24 14:28:29.918606 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:28:29.918172 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7979n_fee20684-697d-4937-9bf8-9549ab5442bf/console-operator/2.log" Apr 24 14:28:29.922188 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:28:29.922160 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jd5s_e0bd6039-b2d8-405a-b478-69690078dd73/ovn-acl-logging/0.log" Apr 24 14:28:29.922415 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:28:29.922400 2567 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jd5s_e0bd6039-b2d8-405a-b478-69690078dd73/ovn-acl-logging/0.log" Apr 24 14:28:29.927564 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:28:29.927544 2567 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 14:33:29.938347 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:33:29.938318 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7979n_fee20684-697d-4937-9bf8-9549ab5442bf/console-operator/2.log" Apr 24 14:33:29.945762 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:33:29.945735 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7979n_fee20684-697d-4937-9bf8-9549ab5442bf/console-operator/2.log" Apr 24 14:33:29.947799 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:33:29.947776 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jd5s_e0bd6039-b2d8-405a-b478-69690078dd73/ovn-acl-logging/0.log" Apr 24 14:33:29.950215 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:33:29.950193 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jd5s_e0bd6039-b2d8-405a-b478-69690078dd73/ovn-acl-logging/0.log" Apr 24 14:38:29.969178 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:38:29.969145 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7979n_fee20684-697d-4937-9bf8-9549ab5442bf/console-operator/2.log" Apr 24 14:38:29.971336 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:38:29.971314 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7979n_fee20684-697d-4937-9bf8-9549ab5442bf/console-operator/2.log" Apr 24 14:38:29.973554 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:38:29.973530 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jd5s_e0bd6039-b2d8-405a-b478-69690078dd73/ovn-acl-logging/0.log" Apr 24 14:38:29.975429 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:38:29.975406 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jd5s_e0bd6039-b2d8-405a-b478-69690078dd73/ovn-acl-logging/0.log" Apr 24 14:43:29.991033 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:43:29.991002 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7979n_fee20684-697d-4937-9bf8-9549ab5442bf/console-operator/2.log" Apr 24 14:43:29.993264 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:43:29.993241 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7979n_fee20684-697d-4937-9bf8-9549ab5442bf/console-operator/2.log" Apr 24 14:43:29.995185 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:43:29.995164 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jd5s_e0bd6039-b2d8-405a-b478-69690078dd73/ovn-acl-logging/0.log" Apr 24 14:43:29.997302 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:43:29.997281 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jd5s_e0bd6039-b2d8-405a-b478-69690078dd73/ovn-acl-logging/0.log" Apr 24 14:48:30.013171 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:48:30.013139 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7979n_fee20684-697d-4937-9bf8-9549ab5442bf/console-operator/2.log" Apr 24 14:48:30.014604 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:48:30.014585 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7979n_fee20684-697d-4937-9bf8-9549ab5442bf/console-operator/2.log" Apr 24 14:48:30.017345 ip-10-0-128-36 kubenswrapper[2567]: I0424 
14:48:30.017327 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jd5s_e0bd6039-b2d8-405a-b478-69690078dd73/ovn-acl-logging/0.log" Apr 24 14:48:30.018661 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:48:30.018643 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jd5s_e0bd6039-b2d8-405a-b478-69690078dd73/ovn-acl-logging/0.log" Apr 24 14:53:30.034196 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:53:30.034165 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7979n_fee20684-697d-4937-9bf8-9549ab5442bf/console-operator/2.log" Apr 24 14:53:30.036638 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:53:30.036613 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7979n_fee20684-697d-4937-9bf8-9549ab5442bf/console-operator/2.log" Apr 24 14:53:30.038476 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:53:30.038456 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jd5s_e0bd6039-b2d8-405a-b478-69690078dd73/ovn-acl-logging/0.log" Apr 24 14:53:30.040784 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:53:30.040764 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jd5s_e0bd6039-b2d8-405a-b478-69690078dd73/ovn-acl-logging/0.log" Apr 24 14:58:30.054930 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:58:30.054892 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7979n_fee20684-697d-4937-9bf8-9549ab5442bf/console-operator/2.log" Apr 24 14:58:30.057399 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:58:30.057378 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7979n_fee20684-697d-4937-9bf8-9549ab5442bf/console-operator/2.log" Apr 
24 14:58:30.059190 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:58:30.059171 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jd5s_e0bd6039-b2d8-405a-b478-69690078dd73/ovn-acl-logging/0.log" Apr 24 14:58:30.061640 ip-10-0-128-36 kubenswrapper[2567]: I0424 14:58:30.061622 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jd5s_e0bd6039-b2d8-405a-b478-69690078dd73/ovn-acl-logging/0.log" Apr 24 15:03:30.076574 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:03:30.076542 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7979n_fee20684-697d-4937-9bf8-9549ab5442bf/console-operator/2.log" Apr 24 15:03:30.078235 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:03:30.078215 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7979n_fee20684-697d-4937-9bf8-9549ab5442bf/console-operator/2.log" Apr 24 15:03:30.080565 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:03:30.080546 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jd5s_e0bd6039-b2d8-405a-b478-69690078dd73/ovn-acl-logging/0.log" Apr 24 15:03:30.081987 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:03:30.081969 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jd5s_e0bd6039-b2d8-405a-b478-69690078dd73/ovn-acl-logging/0.log" Apr 24 15:08:30.097802 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:08:30.097770 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7979n_fee20684-697d-4937-9bf8-9549ab5442bf/console-operator/2.log" Apr 24 15:08:30.099641 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:08:30.099617 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7979n_fee20684-697d-4937-9bf8-9549ab5442bf/console-operator/2.log" Apr 24 15:08:30.101905 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:08:30.101870 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jd5s_e0bd6039-b2d8-405a-b478-69690078dd73/ovn-acl-logging/0.log" Apr 24 15:08:30.103869 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:08:30.103847 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jd5s_e0bd6039-b2d8-405a-b478-69690078dd73/ovn-acl-logging/0.log" Apr 24 15:13:30.121745 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:13:30.121711 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7979n_fee20684-697d-4937-9bf8-9549ab5442bf/console-operator/2.log" Apr 24 15:13:30.124241 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:13:30.124217 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7979n_fee20684-697d-4937-9bf8-9549ab5442bf/console-operator/2.log" Apr 24 15:13:30.125694 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:13:30.125666 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jd5s_e0bd6039-b2d8-405a-b478-69690078dd73/ovn-acl-logging/0.log" Apr 24 15:13:30.128239 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:13:30.128217 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jd5s_e0bd6039-b2d8-405a-b478-69690078dd73/ovn-acl-logging/0.log" Apr 24 15:18:30.142384 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:18:30.142354 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7979n_fee20684-697d-4937-9bf8-9549ab5442bf/console-operator/2.log" Apr 24 15:18:30.145218 ip-10-0-128-36 kubenswrapper[2567]: I0424 
15:18:30.145196 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7979n_fee20684-697d-4937-9bf8-9549ab5442bf/console-operator/2.log" Apr 24 15:18:30.146484 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:18:30.146462 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jd5s_e0bd6039-b2d8-405a-b478-69690078dd73/ovn-acl-logging/0.log" Apr 24 15:18:30.149357 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:18:30.149333 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jd5s_e0bd6039-b2d8-405a-b478-69690078dd73/ovn-acl-logging/0.log" Apr 24 15:23:30.163271 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:23:30.163237 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7979n_fee20684-697d-4937-9bf8-9549ab5442bf/console-operator/2.log" Apr 24 15:23:30.167027 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:23:30.166999 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7979n_fee20684-697d-4937-9bf8-9549ab5442bf/console-operator/2.log" Apr 24 15:23:30.167434 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:23:30.167412 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jd5s_e0bd6039-b2d8-405a-b478-69690078dd73/ovn-acl-logging/0.log" Apr 24 15:23:30.170904 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:23:30.170876 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jd5s_e0bd6039-b2d8-405a-b478-69690078dd73/ovn-acl-logging/0.log" Apr 24 15:28:09.160714 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:09.160681 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-6gs2p_9c15bbb5-ef9b-4df0-b792-073931b97ea8/global-pull-secret-syncer/0.log" Apr 24 
15:28:09.390955 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:09.390926 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-vrgvl_1a01a64d-13d5-4b58-b51d-c8bda8dbefb6/konnectivity-agent/0.log" Apr 24 15:28:09.412287 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:09.412238 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-128-36.ec2.internal_ec6b2c56954869eef2ac269c4c51a583/haproxy/0.log" Apr 24 15:28:13.251824 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:13.251724 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-l59rf_d7576e0a-caf3-4817-8250-2b6570598ac0/cluster-monitoring-operator/0.log" Apr 24 15:28:13.497465 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:13.497434 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-fx47g_379f1218-5ffe-4cc4-a3fe-be26d209b35b/node-exporter/0.log" Apr 24 15:28:13.521320 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:13.521245 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-fx47g_379f1218-5ffe-4cc4-a3fe-be26d209b35b/kube-rbac-proxy/0.log" Apr 24 15:28:13.548441 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:13.548417 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-fx47g_379f1218-5ffe-4cc4-a3fe-be26d209b35b/init-textfile/0.log" Apr 24 15:28:13.931079 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:13.931046 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-vc8xg_23201e36-718e-44a5-a05a-1ff9a57ff20b/prometheus-operator/0.log" Apr 24 15:28:13.949849 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:13.949824 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-vc8xg_23201e36-718e-44a5-a05a-1ff9a57ff20b/kube-rbac-proxy/0.log" Apr 24 15:28:13.975110 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:13.975084 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-8gtlg_46136cc4-31fa-4d7b-a223-af0007d2fd5a/prometheus-operator-admission-webhook/0.log" Apr 24 15:28:15.317578 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:15.317533 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-wsvfh_d63229a9-def3-4d90-95d7-748eed4531eb/networking-console-plugin/0.log" Apr 24 15:28:15.722281 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:15.722229 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7979n_fee20684-697d-4937-9bf8-9549ab5442bf/console-operator/2.log" Apr 24 15:28:15.726338 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:15.726316 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7979n_fee20684-697d-4937-9bf8-9549ab5442bf/console-operator/3.log" Apr 24 15:28:16.392932 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:16.392892 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6vl4w/perf-node-gather-daemonset-pxrnd"] Apr 24 15:28:16.393380 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:16.393318 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cbc7c460-1a4a-47be-8d93-efd98ee46239" containerName="registry" Apr 24 15:28:16.393380 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:16.393335 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbc7c460-1a4a-47be-8d93-efd98ee46239" containerName="registry" Apr 24 15:28:16.393488 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:16.393415 2567 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="cbc7c460-1a4a-47be-8d93-efd98ee46239" containerName="registry" Apr 24 15:28:16.396254 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:16.396233 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6vl4w/perf-node-gather-daemonset-pxrnd" Apr 24 15:28:16.398406 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:16.398389 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6vl4w\"/\"kube-root-ca.crt\"" Apr 24 15:28:16.398514 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:16.398496 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6vl4w\"/\"openshift-service-ca.crt\"" Apr 24 15:28:16.399126 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:16.399112 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-6vl4w\"/\"default-dockercfg-9rknq\"" Apr 24 15:28:16.405400 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:16.405377 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6vl4w/perf-node-gather-daemonset-pxrnd"] Apr 24 15:28:16.428587 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:16.428555 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/71e351b9-7d37-4c13-9da2-a358170af3d9-lib-modules\") pod \"perf-node-gather-daemonset-pxrnd\" (UID: \"71e351b9-7d37-4c13-9da2-a358170af3d9\") " pod="openshift-must-gather-6vl4w/perf-node-gather-daemonset-pxrnd" Apr 24 15:28:16.428724 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:16.428596 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/71e351b9-7d37-4c13-9da2-a358170af3d9-proc\") pod \"perf-node-gather-daemonset-pxrnd\" (UID: 
\"71e351b9-7d37-4c13-9da2-a358170af3d9\") " pod="openshift-must-gather-6vl4w/perf-node-gather-daemonset-pxrnd" Apr 24 15:28:16.428724 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:16.428629 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/71e351b9-7d37-4c13-9da2-a358170af3d9-podres\") pod \"perf-node-gather-daemonset-pxrnd\" (UID: \"71e351b9-7d37-4c13-9da2-a358170af3d9\") " pod="openshift-must-gather-6vl4w/perf-node-gather-daemonset-pxrnd" Apr 24 15:28:16.428724 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:16.428658 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/71e351b9-7d37-4c13-9da2-a358170af3d9-sys\") pod \"perf-node-gather-daemonset-pxrnd\" (UID: \"71e351b9-7d37-4c13-9da2-a358170af3d9\") " pod="openshift-must-gather-6vl4w/perf-node-gather-daemonset-pxrnd" Apr 24 15:28:16.428724 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:16.428682 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7m5w\" (UniqueName: \"kubernetes.io/projected/71e351b9-7d37-4c13-9da2-a358170af3d9-kube-api-access-l7m5w\") pod \"perf-node-gather-daemonset-pxrnd\" (UID: \"71e351b9-7d37-4c13-9da2-a358170af3d9\") " pod="openshift-must-gather-6vl4w/perf-node-gather-daemonset-pxrnd" Apr 24 15:28:16.525884 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:16.525796 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-xw99m_0f90d0c4-7a9d-46c4-bb3a-d2d928bb1412/volume-data-source-validator/0.log" Apr 24 15:28:16.529873 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:16.529851 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7m5w\" (UniqueName: 
\"kubernetes.io/projected/71e351b9-7d37-4c13-9da2-a358170af3d9-kube-api-access-l7m5w\") pod \"perf-node-gather-daemonset-pxrnd\" (UID: \"71e351b9-7d37-4c13-9da2-a358170af3d9\") " pod="openshift-must-gather-6vl4w/perf-node-gather-daemonset-pxrnd" Apr 24 15:28:16.529993 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:16.529911 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/71e351b9-7d37-4c13-9da2-a358170af3d9-lib-modules\") pod \"perf-node-gather-daemonset-pxrnd\" (UID: \"71e351b9-7d37-4c13-9da2-a358170af3d9\") " pod="openshift-must-gather-6vl4w/perf-node-gather-daemonset-pxrnd" Apr 24 15:28:16.529993 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:16.529959 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/71e351b9-7d37-4c13-9da2-a358170af3d9-proc\") pod \"perf-node-gather-daemonset-pxrnd\" (UID: \"71e351b9-7d37-4c13-9da2-a358170af3d9\") " pod="openshift-must-gather-6vl4w/perf-node-gather-daemonset-pxrnd" Apr 24 15:28:16.530099 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:16.529996 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/71e351b9-7d37-4c13-9da2-a358170af3d9-podres\") pod \"perf-node-gather-daemonset-pxrnd\" (UID: \"71e351b9-7d37-4c13-9da2-a358170af3d9\") " pod="openshift-must-gather-6vl4w/perf-node-gather-daemonset-pxrnd" Apr 24 15:28:16.530099 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:16.530027 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/71e351b9-7d37-4c13-9da2-a358170af3d9-sys\") pod \"perf-node-gather-daemonset-pxrnd\" (UID: \"71e351b9-7d37-4c13-9da2-a358170af3d9\") " pod="openshift-must-gather-6vl4w/perf-node-gather-daemonset-pxrnd" Apr 24 15:28:16.530183 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:16.530097 2567 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/71e351b9-7d37-4c13-9da2-a358170af3d9-proc\") pod \"perf-node-gather-daemonset-pxrnd\" (UID: \"71e351b9-7d37-4c13-9da2-a358170af3d9\") " pod="openshift-must-gather-6vl4w/perf-node-gather-daemonset-pxrnd" Apr 24 15:28:16.530183 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:16.530105 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/71e351b9-7d37-4c13-9da2-a358170af3d9-sys\") pod \"perf-node-gather-daemonset-pxrnd\" (UID: \"71e351b9-7d37-4c13-9da2-a358170af3d9\") " pod="openshift-must-gather-6vl4w/perf-node-gather-daemonset-pxrnd" Apr 24 15:28:16.530183 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:16.530096 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/71e351b9-7d37-4c13-9da2-a358170af3d9-lib-modules\") pod \"perf-node-gather-daemonset-pxrnd\" (UID: \"71e351b9-7d37-4c13-9da2-a358170af3d9\") " pod="openshift-must-gather-6vl4w/perf-node-gather-daemonset-pxrnd" Apr 24 15:28:16.530183 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:16.530138 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/71e351b9-7d37-4c13-9da2-a358170af3d9-podres\") pod \"perf-node-gather-daemonset-pxrnd\" (UID: \"71e351b9-7d37-4c13-9da2-a358170af3d9\") " pod="openshift-must-gather-6vl4w/perf-node-gather-daemonset-pxrnd" Apr 24 15:28:16.537059 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:16.537037 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7m5w\" (UniqueName: \"kubernetes.io/projected/71e351b9-7d37-4c13-9da2-a358170af3d9-kube-api-access-l7m5w\") pod \"perf-node-gather-daemonset-pxrnd\" (UID: \"71e351b9-7d37-4c13-9da2-a358170af3d9\") " pod="openshift-must-gather-6vl4w/perf-node-gather-daemonset-pxrnd" Apr 24 
15:28:16.706322 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:16.706230 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6vl4w/perf-node-gather-daemonset-pxrnd" Apr 24 15:28:16.820411 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:16.820378 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6vl4w/perf-node-gather-daemonset-pxrnd"] Apr 24 15:28:16.823218 ip-10-0-128-36 kubenswrapper[2567]: W0424 15:28:16.823190 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod71e351b9_7d37_4c13_9da2_a358170af3d9.slice/crio-21620c832728bf1c141cf81d4a2f03e726d7788cd17823610fb5950f9264ac3c WatchSource:0}: Error finding container 21620c832728bf1c141cf81d4a2f03e726d7788cd17823610fb5950f9264ac3c: Status 404 returned error can't find the container with id 21620c832728bf1c141cf81d4a2f03e726d7788cd17823610fb5950f9264ac3c Apr 24 15:28:16.824693 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:16.824679 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 15:28:17.396123 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:17.396095 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-qcxs6_743626cc-db6b-4ae5-a8dc-cceaa1cb8be0/dns/0.log" Apr 24 15:28:17.418503 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:17.418475 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-qcxs6_743626cc-db6b-4ae5-a8dc-cceaa1cb8be0/kube-rbac-proxy/0.log" Apr 24 15:28:17.487397 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:17.487366 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-p222m_fbd73a0c-457f-455c-b2ca-ef248d74efc8/dns-node-resolver/0.log" Apr 24 15:28:17.560236 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:17.560201 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-6vl4w/perf-node-gather-daemonset-pxrnd" event={"ID":"71e351b9-7d37-4c13-9da2-a358170af3d9","Type":"ContainerStarted","Data":"c5aabad0063d3e5a351c8efb08c36ab21c836f1fbfecccb3be8f9a7d47f77586"} Apr 24 15:28:17.560398 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:17.560240 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6vl4w/perf-node-gather-daemonset-pxrnd" event={"ID":"71e351b9-7d37-4c13-9da2-a358170af3d9","Type":"ContainerStarted","Data":"21620c832728bf1c141cf81d4a2f03e726d7788cd17823610fb5950f9264ac3c"} Apr 24 15:28:17.560398 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:17.560341 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-6vl4w/perf-node-gather-daemonset-pxrnd" Apr 24 15:28:17.575121 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:17.575077 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6vl4w/perf-node-gather-daemonset-pxrnd" podStartSLOduration=1.575062137 podStartE2EDuration="1.575062137s" podCreationTimestamp="2026-04-24 15:28:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 15:28:17.57432706 +0000 UTC m=+3888.164479782" watchObservedRunningTime="2026-04-24 15:28:17.575062137 +0000 UTC m=+3888.165214859" Apr 24 15:28:17.992792 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:17.992758 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-m4w28_2a44ddf6-7291-4385-a9e5-9e6dd777407e/node-ca/0.log" Apr 24 15:28:18.681110 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:18.681078 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-78bb745fb6-x54tc_e88e3b18-0568-4c3d-9672-99f01f7456b4/router/0.log" Apr 24 15:28:19.025032 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:19.024949 2567 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-95gsb_d834bc9a-bc43-42cc-82ed-5b3a77d4da5d/serve-healthcheck-canary/0.log" Apr 24 15:28:19.379014 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:19.378985 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-jvth8_0a1850b6-1bc3-4ff6-a567-7c2e8813a59d/insights-operator/0.log" Apr 24 15:28:19.462763 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:19.462735 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fw95k_48a81750-849c-4b96-b2cb-1463543ee44e/kube-rbac-proxy/0.log" Apr 24 15:28:19.487180 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:19.487154 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fw95k_48a81750-849c-4b96-b2cb-1463543ee44e/exporter/0.log" Apr 24 15:28:19.508311 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:19.508282 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fw95k_48a81750-849c-4b96-b2cb-1463543ee44e/extractor/0.log" Apr 24 15:28:23.572166 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:23.572129 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-6vl4w/perf-node-gather-daemonset-pxrnd" Apr 24 15:28:26.335172 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:26.335149 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-cwc6x_114b4da4-48aa-45ef-9304-25ef4821570d/migrator/0.log" Apr 24 15:28:26.356422 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:26.356397 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-cwc6x_114b4da4-48aa-45ef-9304-25ef4821570d/graceful-termination/0.log" Apr 24 15:28:26.713105 ip-10-0-128-36 kubenswrapper[2567]: I0424 
15:28:26.713075 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-28tw4_efd840e5-21ca-4b28-993c-7e8bd0b9a822/kube-storage-version-migrator-operator/1.log" Apr 24 15:28:26.713922 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:26.713893 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-28tw4_efd840e5-21ca-4b28-993c-7e8bd0b9a822/kube-storage-version-migrator-operator/0.log" Apr 24 15:28:27.876433 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:27.876366 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cb58k_04393dd2-c684-4592-bc88-2223fac95a11/kube-multus-additional-cni-plugins/0.log" Apr 24 15:28:27.897158 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:27.897114 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cb58k_04393dd2-c684-4592-bc88-2223fac95a11/egress-router-binary-copy/0.log" Apr 24 15:28:27.922678 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:27.922650 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cb58k_04393dd2-c684-4592-bc88-2223fac95a11/cni-plugins/0.log" Apr 24 15:28:27.949701 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:27.949678 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cb58k_04393dd2-c684-4592-bc88-2223fac95a11/bond-cni-plugin/0.log" Apr 24 15:28:27.971699 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:27.971677 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cb58k_04393dd2-c684-4592-bc88-2223fac95a11/routeoverride-cni/0.log" Apr 24 15:28:27.991628 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:27.991605 
2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cb58k_04393dd2-c684-4592-bc88-2223fac95a11/whereabouts-cni-bincopy/0.log" Apr 24 15:28:28.016230 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:28.016201 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cb58k_04393dd2-c684-4592-bc88-2223fac95a11/whereabouts-cni/0.log" Apr 24 15:28:28.210039 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:28.209968 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-h5ptb_b5fcbe2e-31cb-4da0-bddb-8eec79e0ca73/kube-multus/0.log" Apr 24 15:28:28.255222 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:28.255187 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-ct9nz_952d5757-28bc-4940-9fa6-4a50ffff6476/network-metrics-daemon/0.log" Apr 24 15:28:28.276681 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:28.276656 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-ct9nz_952d5757-28bc-4940-9fa6-4a50ffff6476/kube-rbac-proxy/0.log" Apr 24 15:28:29.094642 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:29.094614 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jd5s_e0bd6039-b2d8-405a-b478-69690078dd73/ovn-controller/0.log" Apr 24 15:28:29.113784 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:29.113758 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jd5s_e0bd6039-b2d8-405a-b478-69690078dd73/ovn-acl-logging/0.log" Apr 24 15:28:29.133393 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:29.133356 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jd5s_e0bd6039-b2d8-405a-b478-69690078dd73/ovn-acl-logging/1.log" Apr 24 15:28:29.150214 ip-10-0-128-36 kubenswrapper[2567]: I0424 
15:28:29.150187 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jd5s_e0bd6039-b2d8-405a-b478-69690078dd73/kube-rbac-proxy-node/0.log" Apr 24 15:28:29.169199 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:29.169173 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jd5s_e0bd6039-b2d8-405a-b478-69690078dd73/kube-rbac-proxy-ovn-metrics/0.log" Apr 24 15:28:29.189189 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:29.189167 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jd5s_e0bd6039-b2d8-405a-b478-69690078dd73/northd/0.log" Apr 24 15:28:29.210674 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:29.210647 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jd5s_e0bd6039-b2d8-405a-b478-69690078dd73/nbdb/0.log" Apr 24 15:28:29.231284 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:29.231260 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jd5s_e0bd6039-b2d8-405a-b478-69690078dd73/sbdb/0.log" Apr 24 15:28:29.320592 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:29.320546 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jd5s_e0bd6039-b2d8-405a-b478-69690078dd73/ovnkube-controller/0.log" Apr 24 15:28:30.183960 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:30.183931 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7979n_fee20684-697d-4937-9bf8-9549ab5442bf/console-operator/2.log" Apr 24 15:28:30.187679 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:30.187659 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7979n_fee20684-697d-4937-9bf8-9549ab5442bf/console-operator/2.log" Apr 24 15:28:30.188219 ip-10-0-128-36 
kubenswrapper[2567]: I0424 15:28:30.188205 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jd5s_e0bd6039-b2d8-405a-b478-69690078dd73/ovn-acl-logging/0.log" Apr 24 15:28:30.195093 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:30.195075 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8jd5s_e0bd6039-b2d8-405a-b478-69690078dd73/ovn-acl-logging/0.log" Apr 24 15:28:31.149650 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:31.149617 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-rt2p5_5b2b1cb4-7903-4e07-aa4d-25298b638889/check-endpoints/0.log" Apr 24 15:28:31.182166 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:31.182131 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-57rkt_f33ebad9-63f4-4a25-865f-68c02ee70c85/network-check-target-container/0.log" Apr 24 15:28:32.196024 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:32.195991 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-m2lt2_5fc12c1f-0a1b-49b5-b20f-0208c496ba66/iptables-alerter/0.log" Apr 24 15:28:32.933307 ip-10-0-128-36 kubenswrapper[2567]: I0424 15:28:32.933278 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-vl2zx_1fba2eb1-536b-4442-bf40-5af241dd98fd/tuned/0.log"