Mar 18 16:42:04.256387 ip-10-0-132-224 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Mar 18 16:42:04.256403 ip-10-0-132-224 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Mar 18 16:42:04.256412 ip-10-0-132-224 systemd[1]: kubelet.service: Failed with result 'resources'.
Mar 18 16:42:04.256738 ip-10-0-132-224 systemd[1]: Failed to start Kubernetes Kubelet.
Mar 18 16:42:15.618470 ip-10-0-132-224 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Mar 18 16:42:15.618485 ip-10-0-132-224 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 228771396ca9467384465655f6e2875a --
Mar 18 16:44:37.092561 ip-10-0-132-224 systemd[1]: Starting Kubernetes Kubelet...
Mar 18 16:44:37.615475 ip-10-0-132-224 kubenswrapper[2536]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 18 16:44:37.615475 ip-10-0-132-224 kubenswrapper[2536]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 18 16:44:37.615475 ip-10-0-132-224 kubenswrapper[2536]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 18 16:44:37.615475 ip-10-0-132-224 kubenswrapper[2536]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Mar 18 16:44:37.615475 ip-10-0-132-224 kubenswrapper[2536]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 18 16:44:37.616428 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.616334 2536 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 18 16:44:37.619602 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619586 2536 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 18 16:44:37.619602 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619601 2536 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 16:44:37.619665 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619605 2536 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 16:44:37.619665 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619609 2536 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 18 16:44:37.619665 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619613 2536 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 16:44:37.619665 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619616 2536 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 16:44:37.619665 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619619 2536 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 18 16:44:37.619665 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619622 2536 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 18 16:44:37.619665 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619625 2536 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 16:44:37.619665 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619627 2536 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 18 16:44:37.619665 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619630 2536 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 18 16:44:37.619665 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619633 2536 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 16:44:37.619665 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619636 2536 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 18 16:44:37.619665 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619639 2536 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 18 16:44:37.619665 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619642 2536 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 16:44:37.619665 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619645 2536 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 16:44:37.619665 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619647 2536 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 18 16:44:37.619665 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619650 2536 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 18 16:44:37.619665 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619654 2536 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 18 16:44:37.619665 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619658 2536 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 18 16:44:37.619665 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619661 2536 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 18 16:44:37.620124 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619664 2536 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 18 16:44:37.620124 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619668 2536 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 18 16:44:37.620124 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619671 2536 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 18 16:44:37.620124 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619674 2536 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 18 16:44:37.620124 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619677 2536 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 18 16:44:37.620124 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619680 2536 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 18 16:44:37.620124 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619712 2536 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 18 16:44:37.620124 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619716 2536 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 16:44:37.620124 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619719 2536 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 18 16:44:37.620124 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619722 2536 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 18 16:44:37.620124 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619725 2536 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 18 16:44:37.620124 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619729 2536 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 18 16:44:37.620124 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619731 2536 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 18 16:44:37.620124 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619734 2536 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 18 16:44:37.620124 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619736 2536 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 18 16:44:37.620124 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619739 2536 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 18 16:44:37.620124 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619741 2536 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 18 16:44:37.620124 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619744 2536 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 18 16:44:37.620124 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619746 2536 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 18 16:44:37.620124 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619748 2536 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 18 16:44:37.620617 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619751 2536 feature_gate.go:328] unrecognized feature gate: Example2
Mar 18 16:44:37.620617 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619754 2536 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 18 16:44:37.620617 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619756 2536 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 18 16:44:37.620617 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619759 2536 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 18 16:44:37.620617 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619761 2536 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 18 16:44:37.620617 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619764 2536 feature_gate.go:328] unrecognized feature gate: Example
Mar 18 16:44:37.620617 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619766 2536 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 16:44:37.620617 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619769 2536 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 18 16:44:37.620617 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619772 2536 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 16:44:37.620617 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619775 2536 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 18 16:44:37.620617 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619778 2536 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 18 16:44:37.620617 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619780 2536 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 18 16:44:37.620617 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619783 2536 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 16:44:37.620617 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619786 2536 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 18 16:44:37.620617 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619789 2536 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 18 16:44:37.620617 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619792 2536 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 18 16:44:37.620617 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619794 2536 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 16:44:37.620617 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619797 2536 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 18 16:44:37.620617 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619799 2536 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 18 16:44:37.620617 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619802 2536 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 18 16:44:37.621094 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619805 2536 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 16:44:37.621094 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619807 2536 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 18 16:44:37.621094 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619810 2536 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 18 16:44:37.621094 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619812 2536 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 18 16:44:37.621094 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619815 2536 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 18 16:44:37.621094 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619821 2536 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 18 16:44:37.621094 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619824 2536 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 18 16:44:37.621094 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619827 2536 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 18 16:44:37.621094 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619829 2536 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 18 16:44:37.621094 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619832 2536 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 16:44:37.621094 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619835 2536 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 18 16:44:37.621094 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619837 2536 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 18 16:44:37.621094 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619839 2536 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 18 16:44:37.621094 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619842 2536 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 18 16:44:37.621094 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619844 2536 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 16:44:37.621094 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619848 2536 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 18 16:44:37.621094 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619851 2536 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 18 16:44:37.621094 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619853 2536 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 18 16:44:37.621094 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619856 2536 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 18 16:44:37.621094 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619858 2536 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 18 16:44:37.621589 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619863 2536 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 18 16:44:37.621589 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619866 2536 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 16:44:37.621589 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619868 2536 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 16:44:37.621589 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619871 2536 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 16:44:37.621589 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.619873 2536 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 18 16:44:37.621589 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620317 2536 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 18 16:44:37.621589 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620323 2536 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 18 16:44:37.621589 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620326 2536 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 18 16:44:37.621589 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620329 2536 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 18 16:44:37.621589 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620332 2536 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 18 16:44:37.621589 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620334 2536 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 16:44:37.621589 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620337 2536 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 16:44:37.621589 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620340 2536 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 16:44:37.621589 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620343 2536 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 18 16:44:37.621589 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620345 2536 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 18 16:44:37.621589 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620348 2536 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 18 16:44:37.621589 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620350 2536 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 18 16:44:37.621589 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620361 2536 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 18 16:44:37.621589 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620365 2536 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 18 16:44:37.622028 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620368 2536 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 18 16:44:37.622028 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620371 2536 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 18 16:44:37.622028 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620374 2536 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 16:44:37.622028 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620377 2536 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 18 16:44:37.622028 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620380 2536 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 18 16:44:37.622028 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620382 2536 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 18 16:44:37.622028 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620385 2536 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 16:44:37.622028 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620388 2536 feature_gate.go:328] unrecognized feature gate: Example
Mar 18 16:44:37.622028 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620390 2536 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 16:44:37.622028 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620393 2536 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 18 16:44:37.622028 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620396 2536 feature_gate.go:328] unrecognized feature gate: Example2
Mar 18 16:44:37.622028 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620398 2536 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 18 16:44:37.622028 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620401 2536 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 18 16:44:37.622028 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620403 2536 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 18 16:44:37.622028 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620406 2536 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 18 16:44:37.622028 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620408 2536 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 18 16:44:37.622028 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620411 2536 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 18 16:44:37.622028 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620414 2536 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 16:44:37.622028 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620417 2536 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 18 16:44:37.622028 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620420 2536 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 18 16:44:37.622527 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620422 2536 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 16:44:37.622527 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620425 2536 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 18 16:44:37.622527 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620428 2536 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 18 16:44:37.622527 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620430 2536 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 18 16:44:37.622527 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620432 2536 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 18 16:44:37.622527 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620435 2536 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 18 16:44:37.622527 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620438 2536 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 16:44:37.622527 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620440 2536 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 16:44:37.622527 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620442 2536 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 16:44:37.622527 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620445 2536 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 18 16:44:37.622527 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620447 2536 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 16:44:37.622527 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620455 2536 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 18 16:44:37.622527 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620458 2536 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 18 16:44:37.622527 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620460 2536 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 18 16:44:37.622527 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620463 2536 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 18 16:44:37.622527 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620465 2536 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 16:44:37.622527 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620468 2536 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 18 16:44:37.622527 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620470 2536 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 18 16:44:37.622527 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620473 2536 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 18 16:44:37.622527 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620475 2536 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 18 16:44:37.623007 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620477 2536 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 18 16:44:37.623007 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620480 2536 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 18 16:44:37.623007 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620483 2536 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 18 16:44:37.623007 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620485 2536 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 18 16:44:37.623007 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620488 2536 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 18 16:44:37.623007 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620491 2536 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 18 16:44:37.623007 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620493 2536 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 18 16:44:37.623007 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620496 2536 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 18 16:44:37.623007 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620499 2536 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 18 16:44:37.623007 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620501 2536 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 18 16:44:37.623007 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620504 2536 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 18 16:44:37.623007 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620506 2536 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 16:44:37.623007 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620509 2536 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 18 16:44:37.623007 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620512 2536 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 16:44:37.623007 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620514 2536 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 16:44:37.623007 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620517 2536 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 18 16:44:37.623007 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620519 2536 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 18 16:44:37.623007 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620521 2536 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 18 16:44:37.623007 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620524 2536 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 18 16:44:37.623007 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620527 2536 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 18 16:44:37.623501 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620529 2536 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 18 16:44:37.623501 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620531 2536 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 18 16:44:37.623501 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620534 2536 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 16:44:37.623501 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620536 2536 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 18 16:44:37.623501 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620546 2536 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 18 16:44:37.623501 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620550 2536 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 16:44:37.623501 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620553 2536 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 18 16:44:37.623501 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620556 2536 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 18 16:44:37.623501 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620559 2536 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 16:44:37.623501 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620562 2536 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 18 16:44:37.623501 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620564 2536 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 18 16:44:37.623501 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.620567 2536 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 18 16:44:37.623501 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.622896 2536 flags.go:64] FLAG: --address="0.0.0.0"
Mar 18 16:44:37.623501 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.622909 2536 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 18 16:44:37.623501 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.622920 2536 flags.go:64] FLAG: --anonymous-auth="true"
Mar 18 16:44:37.623501 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.622925 2536 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 18 16:44:37.623501 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.622930 2536 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 18 16:44:37.623501 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.622934 2536 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 18 16:44:37.623501 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.622938 2536 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 18 16:44:37.623501 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.622943 2536 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 18 16:44:37.623501 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.622946 2536 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 18 16:44:37.624018 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.622949 2536 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 18 16:44:37.624018 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.622953 2536 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 18 16:44:37.624018 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.622956 2536 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 18 16:44:37.624018 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.622960 2536 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 18 16:44:37.624018 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.622963 2536 flags.go:64] FLAG: --cgroup-root=""
Mar 18 16:44:37.624018 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.622966 2536 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 18 16:44:37.624018 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.622969 2536 flags.go:64] FLAG: --client-ca-file=""
Mar 18 16:44:37.624018 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.622971 2536 flags.go:64] FLAG: --cloud-config=""
Mar 18 16:44:37.624018 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.622974 2536 flags.go:64] FLAG: --cloud-provider="external"
Mar 18 16:44:37.624018 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.622977 2536 flags.go:64] FLAG: --cluster-dns="[]"
Mar 18 16:44:37.624018 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.622984 2536 flags.go:64] FLAG: --cluster-domain=""
Mar 18 16:44:37.624018 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.622987 2536 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 18 16:44:37.624018 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.622990 2536 flags.go:64] FLAG: --config-dir=""
Mar 18 16:44:37.624018 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.622993 2536 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 18 16:44:37.624018 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.622997 2536 flags.go:64] FLAG: --container-log-max-files="5"
Mar 18 16:44:37.624018 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623001 2536 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 18 16:44:37.624018 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623010 2536 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 18 16:44:37.624018 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623013 2536 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 18 16:44:37.624018 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623017 2536 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 18 16:44:37.624018 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623020 2536 flags.go:64] FLAG: --contention-profiling="false"
Mar 18 16:44:37.624018 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623023 2536 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 18 16:44:37.624018 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623026 2536 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 18 16:44:37.624018 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623029 2536 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 18 16:44:37.624018 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623032 2536 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 18 16:44:37.624018 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623036 2536 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 18 16:44:37.624637 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623039 2536 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 18 16:44:37.624637 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623042 2536 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 18 16:44:37.624637 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623045 2536 flags.go:64] FLAG: --enable-load-reader="false"
Mar 18 16:44:37.624637 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623048 2536 flags.go:64] FLAG: --enable-server="true"
Mar 18 16:44:37.624637 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623051 2536 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 18 16:44:37.624637 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623058 2536 flags.go:64] FLAG: --event-burst="100"
Mar 18 16:44:37.624637 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623061 2536 flags.go:64] FLAG: --event-qps="50"
Mar 18 16:44:37.624637 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623064 2536 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 18 16:44:37.624637 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623067 2536 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 18 16:44:37.624637 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623070 2536 flags.go:64] FLAG: --eviction-hard=""
Mar 18 16:44:37.624637 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623074 2536 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 18 16:44:37.624637 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623077 2536 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 18 16:44:37.624637 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623080 2536 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 18 16:44:37.624637 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623082 2536 flags.go:64] FLAG: --eviction-soft=""
Mar 18 16:44:37.624637 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623085 2536 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 18 16:44:37.624637 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623088 2536 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 18 16:44:37.624637 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623091 2536 flags.go:64]
FLAG: --experimental-allocatable-ignore-eviction="false" Mar 18 16:44:37.624637 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623094 2536 flags.go:64] FLAG: --experimental-mounter-path="" Mar 18 16:44:37.624637 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623097 2536 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 18 16:44:37.624637 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623100 2536 flags.go:64] FLAG: --fail-swap-on="true" Mar 18 16:44:37.624637 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623103 2536 flags.go:64] FLAG: --feature-gates="" Mar 18 16:44:37.624637 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623106 2536 flags.go:64] FLAG: --file-check-frequency="20s" Mar 18 16:44:37.624637 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623110 2536 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 18 16:44:37.624637 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623113 2536 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 18 16:44:37.624637 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623124 2536 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 18 16:44:37.625252 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623127 2536 flags.go:64] FLAG: --healthz-port="10248" Mar 18 16:44:37.625252 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623130 2536 flags.go:64] FLAG: --help="false" Mar 18 16:44:37.625252 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623133 2536 flags.go:64] FLAG: --hostname-override="ip-10-0-132-224.ec2.internal" Mar 18 16:44:37.625252 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623136 2536 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 18 16:44:37.625252 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623139 2536 flags.go:64] FLAG: --http-check-frequency="20s" Mar 18 16:44:37.625252 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623142 2536 flags.go:64] FLAG: 
--image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Mar 18 16:44:37.625252 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623145 2536 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Mar 18 16:44:37.625252 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623149 2536 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 18 16:44:37.625252 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623152 2536 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 18 16:44:37.625252 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623155 2536 flags.go:64] FLAG: --image-service-endpoint="" Mar 18 16:44:37.625252 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623158 2536 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 18 16:44:37.625252 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623161 2536 flags.go:64] FLAG: --kube-api-burst="100" Mar 18 16:44:37.625252 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623164 2536 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 18 16:44:37.625252 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623167 2536 flags.go:64] FLAG: --kube-api-qps="50" Mar 18 16:44:37.625252 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623170 2536 flags.go:64] FLAG: --kube-reserved="" Mar 18 16:44:37.625252 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623173 2536 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 18 16:44:37.625252 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623176 2536 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 18 16:44:37.625252 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623179 2536 flags.go:64] FLAG: --kubelet-cgroups="" Mar 18 16:44:37.625252 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623181 2536 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 18 16:44:37.625252 ip-10-0-132-224 
kubenswrapper[2536]: I0318 16:44:37.623185 2536 flags.go:64] FLAG: --lock-file="" Mar 18 16:44:37.625252 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623188 2536 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 18 16:44:37.625252 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623191 2536 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 18 16:44:37.625252 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623194 2536 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 18 16:44:37.625252 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623199 2536 flags.go:64] FLAG: --log-json-split-stream="false" Mar 18 16:44:37.625841 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623202 2536 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 18 16:44:37.625841 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623205 2536 flags.go:64] FLAG: --log-text-split-stream="false" Mar 18 16:44:37.625841 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623208 2536 flags.go:64] FLAG: --logging-format="text" Mar 18 16:44:37.625841 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623211 2536 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 18 16:44:37.625841 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623214 2536 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 18 16:44:37.625841 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623218 2536 flags.go:64] FLAG: --manifest-url="" Mar 18 16:44:37.625841 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623221 2536 flags.go:64] FLAG: --manifest-url-header="" Mar 18 16:44:37.625841 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623226 2536 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 18 16:44:37.625841 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623234 2536 flags.go:64] FLAG: --max-open-files="1000000" Mar 18 16:44:37.625841 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623239 2536 flags.go:64] FLAG: --max-pods="110" Mar 18 
16:44:37.625841 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623242 2536 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 18 16:44:37.625841 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623245 2536 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 18 16:44:37.625841 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623248 2536 flags.go:64] FLAG: --memory-manager-policy="None" Mar 18 16:44:37.625841 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623251 2536 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 18 16:44:37.625841 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623254 2536 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 18 16:44:37.625841 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623257 2536 flags.go:64] FLAG: --node-ip="0.0.0.0" Mar 18 16:44:37.625841 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623260 2536 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Mar 18 16:44:37.625841 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623279 2536 flags.go:64] FLAG: --node-status-max-images="50" Mar 18 16:44:37.625841 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623284 2536 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 18 16:44:37.625841 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623287 2536 flags.go:64] FLAG: --oom-score-adj="-999" Mar 18 16:44:37.625841 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623290 2536 flags.go:64] FLAG: --pod-cidr="" Mar 18 16:44:37.625841 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623293 2536 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b3115b2610585407ab0742648cfbe39c72f57482889f0e778f5ac6fdc482217b" Mar 18 16:44:37.625841 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623298 2536 flags.go:64] FLAG: --pod-manifest-path="" Mar 18 16:44:37.626397 ip-10-0-132-224 kubenswrapper[2536]: I0318 
16:44:37.623301 2536 flags.go:64] FLAG: --pod-max-pids="-1" Mar 18 16:44:37.626397 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623304 2536 flags.go:64] FLAG: --pods-per-core="0" Mar 18 16:44:37.626397 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623307 2536 flags.go:64] FLAG: --port="10250" Mar 18 16:44:37.626397 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623311 2536 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 18 16:44:37.626397 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623313 2536 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-02d22b5e384c50046" Mar 18 16:44:37.626397 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623317 2536 flags.go:64] FLAG: --qos-reserved="" Mar 18 16:44:37.626397 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623320 2536 flags.go:64] FLAG: --read-only-port="10255" Mar 18 16:44:37.626397 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623323 2536 flags.go:64] FLAG: --register-node="true" Mar 18 16:44:37.626397 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623325 2536 flags.go:64] FLAG: --register-schedulable="true" Mar 18 16:44:37.626397 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623328 2536 flags.go:64] FLAG: --register-with-taints="" Mar 18 16:44:37.626397 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623332 2536 flags.go:64] FLAG: --registry-burst="10" Mar 18 16:44:37.626397 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623335 2536 flags.go:64] FLAG: --registry-qps="5" Mar 18 16:44:37.626397 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623337 2536 flags.go:64] FLAG: --reserved-cpus="" Mar 18 16:44:37.626397 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623340 2536 flags.go:64] FLAG: --reserved-memory="" Mar 18 16:44:37.626397 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623344 2536 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 18 16:44:37.626397 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623347 2536 
flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 18 16:44:37.626397 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623350 2536 flags.go:64] FLAG: --rotate-certificates="false" Mar 18 16:44:37.626397 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623353 2536 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 18 16:44:37.626397 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623362 2536 flags.go:64] FLAG: --runonce="false" Mar 18 16:44:37.626397 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623365 2536 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 18 16:44:37.626397 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623368 2536 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 18 16:44:37.626397 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623372 2536 flags.go:64] FLAG: --seccomp-default="false" Mar 18 16:44:37.626397 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623374 2536 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 18 16:44:37.626397 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623377 2536 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 18 16:44:37.626397 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623380 2536 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 18 16:44:37.626397 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623383 2536 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 18 16:44:37.627040 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623386 2536 flags.go:64] FLAG: --storage-driver-password="root" Mar 18 16:44:37.627040 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623389 2536 flags.go:64] FLAG: --storage-driver-secure="false" Mar 18 16:44:37.627040 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623392 2536 flags.go:64] FLAG: --storage-driver-table="stats" Mar 18 16:44:37.627040 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623395 2536 flags.go:64] FLAG: --storage-driver-user="root" Mar 18 
16:44:37.627040 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623404 2536 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 18 16:44:37.627040 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623407 2536 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 18 16:44:37.627040 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623410 2536 flags.go:64] FLAG: --system-cgroups="" Mar 18 16:44:37.627040 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623413 2536 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Mar 18 16:44:37.627040 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623418 2536 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 18 16:44:37.627040 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623421 2536 flags.go:64] FLAG: --tls-cert-file="" Mar 18 16:44:37.627040 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623424 2536 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 18 16:44:37.627040 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623431 2536 flags.go:64] FLAG: --tls-min-version="" Mar 18 16:44:37.627040 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623434 2536 flags.go:64] FLAG: --tls-private-key-file="" Mar 18 16:44:37.627040 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623437 2536 flags.go:64] FLAG: --topology-manager-policy="none" Mar 18 16:44:37.627040 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623440 2536 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 18 16:44:37.627040 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623443 2536 flags.go:64] FLAG: --topology-manager-scope="container" Mar 18 16:44:37.627040 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623446 2536 flags.go:64] FLAG: --v="2" Mar 18 16:44:37.627040 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623450 2536 flags.go:64] FLAG: --version="false" Mar 18 16:44:37.627040 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623454 2536 flags.go:64] FLAG: --vmodule="" 
Mar 18 16:44:37.627040 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623458 2536 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 18 16:44:37.627040 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.623467 2536 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 18 16:44:37.627040 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623590 2536 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 18 16:44:37.627040 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623594 2536 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 18 16:44:37.627040 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623597 2536 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 18 16:44:37.627617 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623600 2536 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 18 16:44:37.627617 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623611 2536 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 18 16:44:37.627617 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623614 2536 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 18 16:44:37.627617 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623617 2536 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 18 16:44:37.627617 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623620 2536 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 18 16:44:37.627617 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623624 2536 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 18 16:44:37.627617 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623628 2536 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 18 16:44:37.627617 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623631 2536 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 18 16:44:37.627617 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623635 2536 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 18 16:44:37.627617 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623637 2536 feature_gate.go:328] unrecognized feature gate: Example
Mar 18 16:44:37.627617 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623640 2536 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 18 16:44:37.627617 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623643 2536 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 18 16:44:37.627617 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623646 2536 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 18 16:44:37.627617 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623648 2536 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 18 16:44:37.627617 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623651 2536 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 16:44:37.627617 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623653 2536 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 16:44:37.627617 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623656 2536 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 18 16:44:37.627617 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623659 2536 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 18 16:44:37.627617 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623661 2536 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 18 16:44:37.627617 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623664 2536 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 16:44:37.628157 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623666 2536 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 18 16:44:37.628157 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623668 2536 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 18 16:44:37.628157 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623671 2536 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 18 16:44:37.628157 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623673 2536 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 18 16:44:37.628157 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623676 2536 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 18 16:44:37.628157 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623678 2536 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 16:44:37.628157 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623681 2536 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 18 16:44:37.628157 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623683 2536 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 18 16:44:37.628157 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623686 2536 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 18 16:44:37.628157 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623689 2536 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 18 16:44:37.628157 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623691 2536 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 18 16:44:37.628157 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623694 2536 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 16:44:37.628157 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623696 2536 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 18 16:44:37.628157 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623698 2536 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 16:44:37.628157 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623706 2536 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 18 16:44:37.628157 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623709 2536 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 18 16:44:37.628157 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623712 2536 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 16:44:37.628157 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623714 2536 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 18 16:44:37.628157 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623717 2536 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 18 16:44:37.628157 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623721 2536 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 16:44:37.628657 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623725 2536 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 18 16:44:37.628657 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623728 2536 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 18 16:44:37.628657 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623730 2536 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 18 16:44:37.628657 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623733 2536 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 16:44:37.628657 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623735 2536 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 16:44:37.628657 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623738 2536 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 16:44:37.628657 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623740 2536 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 16:44:37.628657 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623743 2536 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 16:44:37.628657 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623745 2536 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 18 16:44:37.628657 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623748 2536 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 18 16:44:37.628657 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623750 2536 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 18 16:44:37.628657 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623753 2536 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 18 16:44:37.628657 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623755 2536 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 18 16:44:37.628657 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623757 2536 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 16:44:37.628657 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623760 2536 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 18 16:44:37.628657 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623763 2536 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 18 16:44:37.628657 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623765 2536 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 18 16:44:37.628657 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623767 2536 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 18 16:44:37.628657 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623770 2536 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 18 16:44:37.629187 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623772 2536 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 16:44:37.629187 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623775 2536 feature_gate.go:328] unrecognized feature gate: Example2
Mar 18 16:44:37.629187 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623778 2536 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 18 16:44:37.629187 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623780 2536 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 16:44:37.629187 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623783 2536 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 18 16:44:37.629187 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623785 2536 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 18 16:44:37.629187 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623788 2536 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 18 16:44:37.629187 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623790 2536 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 18 16:44:37.629187 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623798 2536 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 18 16:44:37.629187 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623801 2536 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 16:44:37.629187 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623804 2536 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 16:44:37.629187 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623806 2536 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 18 16:44:37.629187 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623809 2536 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 18 16:44:37.629187 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623811 2536 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 18 16:44:37.629187 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623813 2536 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 18 16:44:37.629187 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623816 2536 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 18 16:44:37.629187 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623819 2536 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 18 16:44:37.629187 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623821 2536 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 18 16:44:37.629187 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623823 2536 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 18 16:44:37.629187 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623826 2536 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 18 16:44:37.629689 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623828 2536 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 18 16:44:37.629689 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623831 2536 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 18 16:44:37.629689 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623833 2536 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 18 16:44:37.629689 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.623836 2536 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 16:44:37.629689 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.624778 2536 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Mar 18 16:44:37.633128 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.633111 2536 server.go:530] "Kubelet version" kubeletVersion="v1.33.8"
Mar 18 16:44:37.633166 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.633129 2536 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 18 16:44:37.633197 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633186 2536 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 18 16:44:37.633197 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633191 2536 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 18 16:44:37.633197 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633194 2536 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 18 16:44:37.633197 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633197 2536 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 16:44:37.633327 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633201 2536 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 18 16:44:37.633327 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633204 2536 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 18 16:44:37.633327 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633207 2536 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 18 16:44:37.633327 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633210 2536 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 18 16:44:37.633327 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633213 2536 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 18 16:44:37.633327 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633216 2536 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 16:44:37.633327 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633219 2536 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 18 16:44:37.633327 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633221 2536 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 18 16:44:37.633327 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633224 2536 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 18 16:44:37.633327 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633227 2536 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 18 16:44:37.633327 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633230 2536 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 18 16:44:37.633327 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633233 2536 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 18 16:44:37.633327 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633237 2536 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 18 16:44:37.633327 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633241 2536 feature_gate.go:328] unrecognized feature gate: Example2
Mar 18 16:44:37.633327 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633244 2536 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 16:44:37.633327 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633247 2536 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 18 16:44:37.633327 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633250 2536 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 18 16:44:37.633327 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633253 2536 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 18 16:44:37.633327 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633256 2536 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 18 16:44:37.633774 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633258 2536 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 18 16:44:37.633774 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633261 2536 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 18 16:44:37.633774 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633264 2536 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 16:44:37.633774
ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633281 2536 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Mar 18 16:44:37.633774 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633284 2536 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Mar 18 16:44:37.633774 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633287 2536 feature_gate.go:328] unrecognized feature gate: NewOLM Mar 18 16:44:37.633774 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633290 2536 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Mar 18 16:44:37.633774 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633292 2536 feature_gate.go:328] unrecognized feature gate: PinnedImages Mar 18 16:44:37.633774 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633295 2536 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Mar 18 16:44:37.633774 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633298 2536 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Mar 18 16:44:37.633774 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633300 2536 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Mar 18 16:44:37.633774 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633303 2536 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Mar 18 16:44:37.633774 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633305 2536 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 18 16:44:37.633774 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633308 2536 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Mar 18 16:44:37.633774 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633311 2536 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Mar 18 16:44:37.633774 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633314 2536 feature_gate.go:328] unrecognized feature gate: 
IngressControllerLBSubnetsAWS Mar 18 16:44:37.633774 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633317 2536 feature_gate.go:328] unrecognized feature gate: GatewayAPI Mar 18 16:44:37.633774 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633319 2536 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Mar 18 16:44:37.633774 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633322 2536 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Mar 18 16:44:37.633774 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633324 2536 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Mar 18 16:44:37.634257 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633327 2536 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Mar 18 16:44:37.634257 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633329 2536 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Mar 18 16:44:37.634257 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633332 2536 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Mar 18 16:44:37.634257 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633336 2536 feature_gate.go:328] unrecognized feature gate: OVNObservability Mar 18 16:44:37.634257 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633338 2536 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Mar 18 16:44:37.634257 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633341 2536 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Mar 18 16:44:37.634257 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633344 2536 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Mar 18 16:44:37.634257 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633346 2536 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Mar 18 16:44:37.634257 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633348 2536 
feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Mar 18 16:44:37.634257 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633351 2536 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Mar 18 16:44:37.634257 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633353 2536 feature_gate.go:328] unrecognized feature gate: DualReplica Mar 18 16:44:37.634257 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633356 2536 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Mar 18 16:44:37.634257 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633359 2536 feature_gate.go:328] unrecognized feature gate: Example Mar 18 16:44:37.634257 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633361 2536 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Mar 18 16:44:37.634257 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633363 2536 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 16:44:37.634257 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633366 2536 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Mar 18 16:44:37.634257 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633368 2536 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 16:44:37.634257 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633371 2536 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Mar 18 16:44:37.634257 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633373 2536 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 18 16:44:37.634257 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633376 2536 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Mar 18 16:44:37.634747 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633378 2536 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Mar 
18 16:44:37.634747 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633382 2536 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 18 16:44:37.634747 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633385 2536 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Mar 18 16:44:37.634747 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633389 2536 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Mar 18 16:44:37.634747 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633392 2536 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Mar 18 16:44:37.634747 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633394 2536 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Mar 18 16:44:37.634747 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633396 2536 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Mar 18 16:44:37.634747 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633399 2536 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Mar 18 16:44:37.634747 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633401 2536 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Mar 18 16:44:37.634747 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633404 2536 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Mar 18 16:44:37.634747 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633406 2536 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Mar 18 16:44:37.634747 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633409 2536 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Mar 18 16:44:37.634747 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633411 2536 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Mar 18 16:44:37.634747 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633414 2536 
feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Mar 18 16:44:37.634747 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633416 2536 feature_gate.go:328] unrecognized feature gate: SignatureStores Mar 18 16:44:37.634747 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633419 2536 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Mar 18 16:44:37.634747 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633422 2536 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Mar 18 16:44:37.634747 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633425 2536 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Mar 18 16:44:37.634747 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633427 2536 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Mar 18 16:44:37.635242 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633430 2536 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 18 16:44:37.635242 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633432 2536 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 16:44:37.635242 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633435 2536 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Mar 18 16:44:37.635242 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633437 2536 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Mar 18 16:44:37.635242 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.633442 2536 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true 
TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Mar 18 16:44:37.635242 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633542 2536 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Mar 18 16:44:37.635242 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633547 2536 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Mar 18 16:44:37.635242 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633549 2536 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 16:44:37.635242 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633552 2536 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Mar 18 16:44:37.635242 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633555 2536 feature_gate.go:328] unrecognized feature gate: Example2 Mar 18 16:44:37.635242 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633559 2536 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Mar 18 16:44:37.635242 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633561 2536 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Mar 18 16:44:37.635242 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633564 2536 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 18 16:44:37.635242 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633567 2536 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Mar 18 16:44:37.635242 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633570 2536 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Mar 18 16:44:37.635627 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633572 2536 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Mar 18 16:44:37.635627 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633575 2536 feature_gate.go:328] 
unrecognized feature gate: AWSDedicatedHosts Mar 18 16:44:37.635627 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633577 2536 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Mar 18 16:44:37.635627 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633580 2536 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Mar 18 16:44:37.635627 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633582 2536 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Mar 18 16:44:37.635627 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633585 2536 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Mar 18 16:44:37.635627 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633588 2536 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Mar 18 16:44:37.635627 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633590 2536 feature_gate.go:328] unrecognized feature gate: InsightsConfig Mar 18 16:44:37.635627 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633592 2536 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Mar 18 16:44:37.635627 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633595 2536 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Mar 18 16:44:37.635627 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633597 2536 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Mar 18 16:44:37.635627 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633600 2536 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 16:44:37.635627 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633602 2536 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Mar 18 16:44:37.635627 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633605 2536 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Mar 18 16:44:37.635627 ip-10-0-132-224 kubenswrapper[2536]: 
W0318 16:44:37.633607 2536 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Mar 18 16:44:37.635627 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633610 2536 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Mar 18 16:44:37.635627 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633613 2536 feature_gate.go:328] unrecognized feature gate: Example Mar 18 16:44:37.635627 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633615 2536 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 18 16:44:37.635627 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633619 2536 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 18 16:44:37.635627 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633622 2536 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Mar 18 16:44:37.636119 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633625 2536 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Mar 18 16:44:37.636119 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633628 2536 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Mar 18 16:44:37.636119 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633631 2536 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Mar 18 16:44:37.636119 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633633 2536 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Mar 18 16:44:37.636119 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633636 2536 feature_gate.go:328] unrecognized feature gate: NewOLM Mar 18 16:44:37.636119 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633638 2536 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Mar 18 16:44:37.636119 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633641 2536 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets 
Mar 18 16:44:37.636119 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633643 2536 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Mar 18 16:44:37.636119 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633646 2536 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Mar 18 16:44:37.636119 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633649 2536 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Mar 18 16:44:37.636119 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633651 2536 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Mar 18 16:44:37.636119 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633655 2536 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Mar 18 16:44:37.636119 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633658 2536 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Mar 18 16:44:37.636119 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633662 2536 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Mar 18 16:44:37.636119 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633665 2536 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Mar 18 16:44:37.636119 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633667 2536 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Mar 18 16:44:37.636119 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633670 2536 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Mar 18 16:44:37.636119 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633673 2536 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Mar 18 16:44:37.636119 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633676 2536 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Mar 18 16:44:37.636600 
ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633678 2536 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Mar 18 16:44:37.636600 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633681 2536 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Mar 18 16:44:37.636600 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633683 2536 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Mar 18 16:44:37.636600 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633686 2536 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 16:44:37.636600 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633689 2536 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 18 16:44:37.636600 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633691 2536 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Mar 18 16:44:37.636600 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633694 2536 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 18 16:44:37.636600 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633696 2536 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Mar 18 16:44:37.636600 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633699 2536 feature_gate.go:328] unrecognized feature gate: PinnedImages Mar 18 16:44:37.636600 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633702 2536 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Mar 18 16:44:37.636600 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633705 2536 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Mar 18 16:44:37.636600 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633707 2536 feature_gate.go:328] unrecognized feature gate: DualReplica Mar 18 16:44:37.636600 ip-10-0-132-224 kubenswrapper[2536]: W0318 
16:44:37.633710 2536 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Mar 18 16:44:37.636600 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633712 2536 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Mar 18 16:44:37.636600 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633715 2536 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Mar 18 16:44:37.636600 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633717 2536 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Mar 18 16:44:37.636600 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633720 2536 feature_gate.go:328] unrecognized feature gate: GatewayAPI Mar 18 16:44:37.636600 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633722 2536 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Mar 18 16:44:37.636600 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633724 2536 feature_gate.go:328] unrecognized feature gate: SignatureStores Mar 18 16:44:37.636600 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633727 2536 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Mar 18 16:44:37.637072 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633729 2536 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Mar 18 16:44:37.637072 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633732 2536 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Mar 18 16:44:37.637072 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633735 2536 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Mar 18 16:44:37.637072 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633737 2536 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Mar 18 16:44:37.637072 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633739 2536 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Mar 18 16:44:37.637072 ip-10-0-132-224 
kubenswrapper[2536]: W0318 16:44:37.633742 2536 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Mar 18 16:44:37.637072 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633744 2536 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Mar 18 16:44:37.637072 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633747 2536 feature_gate.go:328] unrecognized feature gate: OVNObservability Mar 18 16:44:37.637072 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633750 2536 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 18 16:44:37.637072 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633752 2536 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Mar 18 16:44:37.637072 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633754 2536 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Mar 18 16:44:37.637072 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633757 2536 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Mar 18 16:44:37.637072 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633759 2536 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Mar 18 16:44:37.637072 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633762 2536 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Mar 18 16:44:37.637072 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633764 2536 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Mar 18 16:44:37.637072 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633767 2536 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Mar 18 16:44:37.637072 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:37.633769 2536 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 16:44:37.637490 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.633774 2536 feature_gate.go:384] feature gates: 
{map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Mar 18 16:44:37.637490 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.634414 2536 server.go:962] "Client rotation is on, will bootstrap in background" Mar 18 16:44:37.637490 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.636346 2536 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 18 16:44:37.637490 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.637373 2536 server.go:1019] "Starting client certificate rotation" Mar 18 16:44:37.637844 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.637826 2536 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Mar 18 16:44:37.638555 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.638544 2536 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Mar 18 16:44:37.673565 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.673530 2536 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 18 16:44:37.676184 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.676165 2536 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 18 16:44:37.695749 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.695724 2536 log.go:25] "Validated CRI v1 runtime API" Mar 18 16:44:37.698698 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.698678 2536 
reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Mar 18 16:44:37.701212 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.701191 2536 log.go:25] "Validated CRI v1 image API" Mar 18 16:44:37.702375 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.702357 2536 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 18 16:44:37.709681 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.709657 2536 fs.go:135] Filesystem UUIDs: map[49af3cef-558e-43a6-8730-d86292cff7de:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 b35eaf24-51d3-4361-aed9-4fce38a0373a:/dev/nvme0n1p3] Mar 18 16:44:37.709756 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.709680 2536 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Mar 18 16:44:37.716462 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.716441 2536 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-pj5gh" Mar 18 16:44:37.716822 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.716707 2536 manager.go:217] Machine: {Timestamp:2026-03-18 16:44:37.714558617 +0000 UTC m=+0.488133407 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3108598 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 
NumPages:0}] MachineID:ec2894430b7959f0f1b454158f691553 SystemUUID:ec289443-0b79-59f0-f1b4-54158f691553 BootID:22877139-6ca9-4673-8446-5655f6e2875a Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6094848 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:4c:c8:6e:1e:db Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:4c:c8:6e:1e:db Speed:0 Mtu:9001} {Name:ovs-system MacAddress:5a:8a:38:cf:fc:d3 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] 
Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 18 16:44:37.716822 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.716820 2536 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Mar 18 16:44:37.716953 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.716942 2536 manager.go:233] Version: {KernelVersion:5.14.0-570.96.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260303-1 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 18 16:44:37.717278 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.717248 2536 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 18 16:44:37.717438 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.717283 2536 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"ip-10-0-132-224.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 18 16:44:37.717480 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.717449 2536 topology_manager.go:138] "Creating topology manager with none policy" Mar 18 16:44:37.717480 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.717458 2536 container_manager_linux.go:306] "Creating device plugin manager" Mar 18 16:44:37.717480 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.717472 
2536 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 18 16:44:37.718235 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.718225 2536 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 18 16:44:37.719028 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.719018 2536 state_mem.go:36] "Initialized new in-memory state store" Mar 18 16:44:37.719131 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.719123 2536 server.go:1267] "Using root directory" path="/var/lib/kubelet" Mar 18 16:44:37.721682 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.721672 2536 kubelet.go:491] "Attempting to sync node with API server" Mar 18 16:44:37.721721 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.721686 2536 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 18 16:44:37.721721 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.721698 2536 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 18 16:44:37.721721 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.721708 2536 kubelet.go:397] "Adding apiserver pod source" Mar 18 16:44:37.721721 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.721717 2536 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 18 16:44:37.723099 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.723088 2536 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Mar 18 16:44:37.723147 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.723110 2536 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Mar 18 16:44:37.724698 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.724682 2536 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-pj5gh" Mar 18 16:44:37.726527 ip-10-0-132-224 kubenswrapper[2536]: 
I0318 16:44:37.726510 2536 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.9-3.rhaos4.20.gitb9ac835.el9" apiVersion="v1" Mar 18 16:44:37.728181 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.728164 2536 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Mar 18 16:44:37.729743 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.729729 2536 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 18 16:44:37.729811 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.729749 2536 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 18 16:44:37.729811 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.729758 2536 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 18 16:44:37.729811 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.729768 2536 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 18 16:44:37.729811 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.729777 2536 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 18 16:44:37.729811 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.729786 2536 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 18 16:44:37.729811 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.729795 2536 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 18 16:44:37.729811 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.729804 2536 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 18 16:44:37.729811 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.729814 2536 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 18 16:44:37.730053 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.729825 2536 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/configmap" Mar 18 16:44:37.730053 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.729838 2536 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 18 16:44:37.730053 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.729851 2536 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 18 16:44:37.731216 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.731204 2536 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 18 16:44:37.731289 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.731218 2536 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Mar 18 16:44:37.734605 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.734590 2536 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Mar 18 16:44:37.734721 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.734708 2536 watchdog_linux.go:99] "Systemd watchdog is not enabled" Mar 18 16:44:37.734783 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.734753 2536 server.go:1295] "Started kubelet" Mar 18 16:44:37.734827 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.734805 2536 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Mar 18 16:44:37.735305 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.735251 2536 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 18 16:44:37.735359 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.735323 2536 server_v1.go:47] "podresources" method="list" useActivePods=true Mar 18 16:44:37.735649 ip-10-0-132-224 systemd[1]: Started Kubernetes Kubelet. 
Mar 18 16:44:37.736466 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.736450 2536 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 18 16:44:37.736790 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.736773 2536 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Mar 18 16:44:37.739126 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.739104 2536 server.go:317] "Adding debug handlers to kubelet server" Mar 18 16:44:37.740591 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.740575 2536 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-132-224.ec2.internal" not found Mar 18 16:44:37.743908 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.743697 2536 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Mar 18 16:44:37.744397 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.744221 2536 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 18 16:44:37.745064 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.744914 2536 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Mar 18 16:44:37.745064 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.744924 2536 volume_manager.go:295] "The desired_state_of_world populator starts" Mar 18 16:44:37.745064 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.745065 2536 volume_manager.go:297] "Starting Kubelet Volume Manager" Mar 18 16:44:37.745229 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.745202 2536 reconstruct.go:97] "Volume reconstruction finished" Mar 18 16:44:37.745229 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.745209 2536 reconciler.go:26] "Reconciler: start to sync state" Mar 18 16:44:37.745536 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.745523 2536 factory.go:55] Registering systemd factory Mar 18 16:44:37.745592 ip-10-0-132-224 kubenswrapper[2536]: 
I0318 16:44:37.745561 2536 factory.go:223] Registration of the systemd container factory successfully Mar 18 16:44:37.745838 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.745823 2536 factory.go:153] Registering CRI-O factory Mar 18 16:44:37.745927 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.745840 2536 factory.go:223] Registration of the crio container factory successfully Mar 18 16:44:37.745983 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.745956 2536 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 18 16:44:37.745983 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.745983 2536 factory.go:103] Registering Raw factory Mar 18 16:44:37.746085 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.745999 2536 manager.go:1196] Started watching for new ooms in manager Mar 18 16:44:37.746232 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:37.746207 2536 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-224.ec2.internal\" not found" Mar 18 16:44:37.747062 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.747039 2536 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Mar 18 16:44:37.747384 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.746574 2536 manager.go:319] Starting recovery of all containers Mar 18 16:44:37.750166 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:37.750138 2536 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-132-224.ec2.internal\" not found" node="ip-10-0-132-224.ec2.internal" Mar 18 16:44:37.756679 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.756662 2536 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-132-224.ec2.internal" not found Mar 18 
16:44:37.757107 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.757093 2536 manager.go:324] Recovery completed Mar 18 16:44:37.761357 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.761344 2536 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 18 16:44:37.763075 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.763059 2536 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-224.ec2.internal" event="NodeHasSufficientMemory" Mar 18 16:44:37.763152 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.763098 2536 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-224.ec2.internal" event="NodeHasNoDiskPressure" Mar 18 16:44:37.763152 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.763113 2536 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-224.ec2.internal" event="NodeHasSufficientPID" Mar 18 16:44:37.763631 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.763618 2536 cpu_manager.go:222] "Starting CPU manager" policy="none" Mar 18 16:44:37.763631 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.763630 2536 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Mar 18 16:44:37.763721 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.763648 2536 state_mem.go:36] "Initialized new in-memory state store" Mar 18 16:44:37.764868 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.764857 2536 policy_none.go:49] "None policy: Start" Mar 18 16:44:37.764909 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.764872 2536 memory_manager.go:186] "Starting memorymanager" policy="None" Mar 18 16:44:37.764909 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.764883 2536 state_mem.go:35] "Initializing new in-memory state store" Mar 18 16:44:37.818459 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.805879 2536 manager.go:341] "Starting Device Plugin manager" Mar 18 16:44:37.818459 ip-10-0-132-224 kubenswrapper[2536]: 
E0318 16:44:37.805915 2536 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Mar 18 16:44:37.818459 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.805926 2536 server.go:85] "Starting device plugin registration server" Mar 18 16:44:37.818459 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.806167 2536 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 18 16:44:37.818459 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.806183 2536 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 18 16:44:37.818459 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.806242 2536 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 18 16:44:37.818459 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.806353 2536 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 18 16:44:37.818459 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.806362 2536 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 18 16:44:37.818459 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:37.806855 2536 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Mar 18 16:44:37.818459 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:37.806887 2536 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-132-224.ec2.internal\" not found" Mar 18 16:44:37.818459 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.815191 2536 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-132-224.ec2.internal" not found Mar 18 16:44:37.903929 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.903834 2536 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv4" Mar 18 16:44:37.905062 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.905046 2536 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Mar 18 16:44:37.905133 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.905073 2536 status_manager.go:230] "Starting to sync pod status with apiserver" Mar 18 16:44:37.905133 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.905092 2536 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Mar 18 16:44:37.905133 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.905099 2536 kubelet.go:2451] "Starting kubelet main sync loop" Mar 18 16:44:37.905282 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:37.905134 2536 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Mar 18 16:44:37.906302 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.906287 2536 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 18 16:44:37.907346 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.907325 2536 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-224.ec2.internal" event="NodeHasSufficientMemory" Mar 18 16:44:37.907346 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.907358 2536 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-224.ec2.internal" event="NodeHasNoDiskPressure" Mar 18 16:44:37.907504 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.907369 2536 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-224.ec2.internal" event="NodeHasSufficientPID" Mar 18 16:44:37.907504 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.907391 2536 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-132-224.ec2.internal" Mar 18 16:44:37.908581 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.908563 2536 reflector.go:430] "Caches 
populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Mar 18 16:44:37.916396 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:37.916379 2536 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-132-224.ec2.internal" Mar 18 16:44:37.916459 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:37.916405 2536 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-132-224.ec2.internal\": node \"ip-10-0-132-224.ec2.internal\" not found" Mar 18 16:44:38.005948 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.005906 2536 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-224.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-132-224.ec2.internal"] Mar 18 16:44:38.008633 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.008618 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-224.ec2.internal" Mar 18 16:44:38.008704 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.008629 2536 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-224.ec2.internal" Mar 18 16:44:38.033921 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.033904 2536 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-224.ec2.internal" Mar 18 16:44:38.038264 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.038246 2536 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-224.ec2.internal" Mar 18 16:44:38.046457 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.046440 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3fe312c662a48946cb6ecc07ea171293-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-224.ec2.internal\" (UID: \"3fe312c662a48946cb6ecc07ea171293\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-224.ec2.internal" Mar 18 16:44:38.046543 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.046470 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3fe312c662a48946cb6ecc07ea171293-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-224.ec2.internal\" (UID: \"3fe312c662a48946cb6ecc07ea171293\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-224.ec2.internal" Mar 18 16:44:38.046543 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.046498 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/7a6f0eddc0950f127c5c3c1035cc71a3-config\") pod \"kube-apiserver-proxy-ip-10-0-132-224.ec2.internal\" (UID: \"7a6f0eddc0950f127c5c3c1035cc71a3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-224.ec2.internal" Mar 18 16:44:38.046699 
ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.046683 2536 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Mar 18 16:44:38.050367 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.050352 2536 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Mar 18 16:44:38.147664 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.147636 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3fe312c662a48946cb6ecc07ea171293-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-224.ec2.internal\" (UID: \"3fe312c662a48946cb6ecc07ea171293\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-224.ec2.internal" Mar 18 16:44:38.147664 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.147666 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3fe312c662a48946cb6ecc07ea171293-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-224.ec2.internal\" (UID: \"3fe312c662a48946cb6ecc07ea171293\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-224.ec2.internal" Mar 18 16:44:38.147852 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.147687 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/7a6f0eddc0950f127c5c3c1035cc71a3-config\") pod \"kube-apiserver-proxy-ip-10-0-132-224.ec2.internal\" (UID: \"7a6f0eddc0950f127c5c3c1035cc71a3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-224.ec2.internal" Mar 18 16:44:38.147852 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.147782 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3fe312c662a48946cb6ecc07ea171293-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-224.ec2.internal\" (UID: \"3fe312c662a48946cb6ecc07ea171293\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-224.ec2.internal" Mar 18 16:44:38.147852 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.147808 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3fe312c662a48946cb6ecc07ea171293-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-224.ec2.internal\" (UID: \"3fe312c662a48946cb6ecc07ea171293\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-224.ec2.internal" Mar 18 16:44:38.147852 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.147836 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/7a6f0eddc0950f127c5c3c1035cc71a3-config\") pod \"kube-apiserver-proxy-ip-10-0-132-224.ec2.internal\" (UID: \"7a6f0eddc0950f127c5c3c1035cc71a3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-224.ec2.internal" Mar 18 16:44:38.350052 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.349978 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-224.ec2.internal" Mar 18 16:44:38.352224 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.352208 2536 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-224.ec2.internal" Mar 18 16:44:38.636662 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.636630 2536 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 18 16:44:38.637333 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.636787 2536 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Mar 18 16:44:38.637333 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.636797 2536 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Mar 18 16:44:38.637333 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.636805 2536 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Mar 18 16:44:38.722845 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.722817 2536 apiserver.go:52] "Watching apiserver" Mar 18 16:44:38.726017 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.725984 2536 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-03-17 16:39:37 +0000 UTC" deadline="2027-11-03 05:39:56.767775937 +0000 UTC" Mar 18 16:44:38.726017 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.726013 2536 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14268h55m18.041765779s" Mar 18 16:44:38.728448 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.728430 2536 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Mar 18 16:44:38.730928 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.730903 2536 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-74479","openshift-network-diagnostics/network-check-target-cr96r","kube-system/konnectivity-agent-pj5bn","openshift-cluster-node-tuning-operator/tuned-85jj7","openshift-image-registry/node-ca-jnvc6","openshift-multus/multus-additional-cni-plugins-d7xhk","openshift-multus/network-metrics-daemon-rjx6m","openshift-network-operator/iptables-alerter-5ms7r","openshift-ovn-kubernetes/ovnkube-node-ts4mw","kube-system/global-pull-secret-syncer-f52ql","kube-system/kube-apiserver-proxy-ip-10-0-132-224.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8pqnx","openshift-dns/node-resolver-9fw2f","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-224.ec2.internal"] Mar 18 16:44:38.732842 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.732821 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rjx6m" Mar 18 16:44:38.732949 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:38.732883 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rjx6m" podUID="275a2fa6-277f-40dc-a2bc-749a97550e2e" Mar 18 16:44:38.734985 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.734964 2536 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8pqnx"
Mar 18 16:44:38.736347 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.736326 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-pj5bn"
Mar 18 16:44:38.736440 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.736393 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-85jj7"
Mar 18 16:44:38.737461 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.737307 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Mar 18 16:44:38.737461 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.737320 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Mar 18 16:44:38.737461 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.737327 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Mar 18 16:44:38.737461 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.737453 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-czlpq\""
Mar 18 16:44:38.738480 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.738072 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jnvc6"
Mar 18 16:44:38.738638 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.738618 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Mar 18 16:44:38.738996 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.738976 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-gt48b\""
Mar 18 16:44:38.739087 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.738998 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Mar 18 16:44:38.739883 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.739452 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Mar 18 16:44:38.739883 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.739579 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Mar 18 16:44:38.739883 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.739599 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-srxh7\""
Mar 18 16:44:38.740090 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.739908 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Mar 18 16:44:38.740429 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.740410 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-pxc2z\""
Mar 18 16:44:38.740672 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.740652 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Mar 18 16:44:38.740742 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.740729 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Mar 18 16:44:38.741389 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.741371 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-5ms7r"
Mar 18 16:44:38.743337 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.743317 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Mar 18 16:44:38.743505 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.743348 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Mar 18 16:44:38.743505 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.743382 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Mar 18 16:44:38.743505 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.743353 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-mcmfr\""
Mar 18 16:44:38.743505 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.743469 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw"
Mar 18 16:44:38.743704 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.743552 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-74479"
Mar 18 16:44:38.743910 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.743889 2536 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Mar 18 16:44:38.745336 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.745067 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9fw2f"
Mar 18 16:44:38.745336 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.745121 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Mar 18 16:44:38.745479 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.745429 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Mar 18 16:44:38.745713 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.745696 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-bc7nv\""
Mar 18 16:44:38.745807 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.745733 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Mar 18 16:44:38.745807 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.745758 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Mar 18 16:44:38.746106 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.746091 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Mar 18 16:44:38.746183 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.746167 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Mar 18 16:44:38.746242 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.746211 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Mar 18 16:44:38.746307 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.746251 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Mar 18 16:44:38.746860 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.746449 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Mar 18 16:44:38.746860 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.746574 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Mar 18 16:44:38.746860 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.746653 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-hxg5b\""
Mar 18 16:44:38.746860 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.746676 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-f52ql"
Mar 18 16:44:38.746860 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:38.746739 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-f52ql" podUID="b998ab02-161a-40e9-9e53-7183a98152de"
Mar 18 16:44:38.747052 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.746938 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-67h2l\""
Mar 18 16:44:38.747052 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.747023 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Mar 18 16:44:38.747171 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.747154 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Mar 18 16:44:38.748096 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.748074 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cr96r"
Mar 18 16:44:38.748180 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:38.748123 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cr96r" podUID="440da786-0ff1-4727-bd36-e32a3acc5a3c"
Mar 18 16:44:38.749544 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.749529 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-d7xhk"
Mar 18 16:44:38.750627 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.750607 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-host-run-ovn-kubernetes\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw"
Mar 18 16:44:38.750703 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.750639 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zp9d\" (UniqueName: \"kubernetes.io/projected/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-kube-api-access-8zp9d\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw"
Mar 18 16:44:38.750703 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.750659 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c13c7653-cb9a-48c0-a38e-8c050869e5a0-registration-dir\") pod \"aws-ebs-csi-driver-node-8pqnx\" (UID: \"c13c7653-cb9a-48c0-a38e-8c050869e5a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8pqnx"
Mar 18 16:44:38.750703 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.750674 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcnnk\" (UniqueName: \"kubernetes.io/projected/c13c7653-cb9a-48c0-a38e-8c050869e5a0-kube-api-access-mcnnk\") pod \"aws-ebs-csi-driver-node-8pqnx\" (UID: \"c13c7653-cb9a-48c0-a38e-8c050869e5a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8pqnx"
Mar 18 16:44:38.750703 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.750689 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b0763006-cf97-41dc-a668-a271612d9705-etc-kubernetes\") pod \"tuned-85jj7\" (UID: \"b0763006-cf97-41dc-a668-a271612d9705\") " pod="openshift-cluster-node-tuning-operator/tuned-85jj7"
Mar 18 16:44:38.750870 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.750705 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b0763006-cf97-41dc-a668-a271612d9705-tmp\") pod \"tuned-85jj7\" (UID: \"b0763006-cf97-41dc-a668-a271612d9705\") " pod="openshift-cluster-node-tuning-operator/tuned-85jj7"
Mar 18 16:44:38.750870 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.750722 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c68d2b91-1efd-47b8-93dc-98606a96920b-hostroot\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479"
Mar 18 16:44:38.750870 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.750774 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c13c7653-cb9a-48c0-a38e-8c050869e5a0-etc-selinux\") pod \"aws-ebs-csi-driver-node-8pqnx\" (UID: \"c13c7653-cb9a-48c0-a38e-8c050869e5a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8pqnx"
Mar 18 16:44:38.750870 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.750810 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9464f832-e8a5-4469-9d2d-db0f0f547d86-host-slash\") pod \"iptables-alerter-5ms7r\" (UID: \"9464f832-e8a5-4469-9d2d-db0f0f547d86\") " pod="openshift-network-operator/iptables-alerter-5ms7r"
Mar 18 16:44:38.750870 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.750837 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b0763006-cf97-41dc-a668-a271612d9705-etc-modprobe-d\") pod \"tuned-85jj7\" (UID: \"b0763006-cf97-41dc-a668-a271612d9705\") " pod="openshift-cluster-node-tuning-operator/tuned-85jj7"
Mar 18 16:44:38.751052 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.750872 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c68d2b91-1efd-47b8-93dc-98606a96920b-host-var-lib-cni-bin\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479"
Mar 18 16:44:38.751052 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.750913 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9464f832-e8a5-4469-9d2d-db0f0f547d86-iptables-alerter-script\") pod \"iptables-alerter-5ms7r\" (UID: \"9464f832-e8a5-4469-9d2d-db0f0f547d86\") " pod="openshift-network-operator/iptables-alerter-5ms7r"
Mar 18 16:44:38.751052 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.750963 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-etc-openvswitch\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw"
Mar 18 16:44:38.751052 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.750988 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-log-socket\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw"
Mar 18 16:44:38.751052 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.751012 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c68d2b91-1efd-47b8-93dc-98606a96920b-multus-conf-dir\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479"
Mar 18 16:44:38.751052 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.751036 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c87e1483-58f2-438b-af86-c607ffcbf01c-konnectivity-ca\") pod \"konnectivity-agent-pj5bn\" (UID: \"c87e1483-58f2-438b-af86-c607ffcbf01c\") " pod="kube-system/konnectivity-agent-pj5bn"
Mar 18 16:44:38.751256 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.751078 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b0763006-cf97-41dc-a668-a271612d9705-sys\") pod \"tuned-85jj7\" (UID: \"b0763006-cf97-41dc-a668-a271612d9705\") " pod="openshift-cluster-node-tuning-operator/tuned-85jj7"
Mar 18 16:44:38.751256 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.751102 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b0763006-cf97-41dc-a668-a271612d9705-host\") pod \"tuned-85jj7\" (UID: \"b0763006-cf97-41dc-a668-a271612d9705\") " pod="openshift-cluster-node-tuning-operator/tuned-85jj7"
Mar 18 16:44:38.751256 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.751143 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-systemd-units\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw"
Mar 18 16:44:38.751256 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.751158 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c68d2b91-1efd-47b8-93dc-98606a96920b-cnibin\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479"
Mar 18 16:44:38.751256 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.751172 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c68d2b91-1efd-47b8-93dc-98606a96920b-cni-binary-copy\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479"
Mar 18 16:44:38.751256 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.751191 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/37693aba-cef0-4f5a-a523-bd82dbff0143-tmp-dir\") pod \"node-resolver-9fw2f\" (UID: \"37693aba-cef0-4f5a-a523-bd82dbff0143\") " pod="openshift-dns/node-resolver-9fw2f"
Mar 18 16:44:38.751256 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.751207 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/275a2fa6-277f-40dc-a2bc-749a97550e2e-metrics-certs\") pod \"network-metrics-daemon-rjx6m\" (UID: \"275a2fa6-277f-40dc-a2bc-749a97550e2e\") " pod="openshift-multus/network-metrics-daemon-rjx6m"
Mar 18 16:44:38.751256 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.751248 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9n2p\" (UniqueName: \"kubernetes.io/projected/275a2fa6-277f-40dc-a2bc-749a97550e2e-kube-api-access-q9n2p\") pod \"network-metrics-daemon-rjx6m\" (UID: \"275a2fa6-277f-40dc-a2bc-749a97550e2e\") " pod="openshift-multus/network-metrics-daemon-rjx6m"
Mar 18 16:44:38.751701 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.751315 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Mar 18 16:44:38.751701 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.751330 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-dzv4d\""
Mar 18 16:44:38.751701 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.751369 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Mar 18 16:44:38.751701 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.751299 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a421b67e-f253-48e4-b5a3-4a895c3bf6d2-serviceca\") pod \"node-ca-jnvc6\" (UID: \"a421b67e-f253-48e4-b5a3-4a895c3bf6d2\") " pod="openshift-image-registry/node-ca-jnvc6"
Mar 18 16:44:38.751701 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.751429 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqdgg\" (UniqueName: \"kubernetes.io/projected/a421b67e-f253-48e4-b5a3-4a895c3bf6d2-kube-api-access-cqdgg\") pod \"node-ca-jnvc6\" (UID: \"a421b67e-f253-48e4-b5a3-4a895c3bf6d2\") " pod="openshift-image-registry/node-ca-jnvc6"
Mar 18 16:44:38.751701 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.751449 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-ovnkube-config\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw"
Mar 18 16:44:38.751701 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.751472 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cc8l\" (UniqueName: \"kubernetes.io/projected/c68d2b91-1efd-47b8-93dc-98606a96920b-kube-api-access-4cc8l\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479"
Mar 18 16:44:38.751701 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.751516 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b0763006-cf97-41dc-a668-a271612d9705-lib-modules\") pod \"tuned-85jj7\" (UID: \"b0763006-cf97-41dc-a668-a271612d9705\") " pod="openshift-cluster-node-tuning-operator/tuned-85jj7"
Mar 18 16:44:38.751701 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.751553 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-host-cni-netd\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw"
Mar 18 16:44:38.751701 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.751578 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c13c7653-cb9a-48c0-a38e-8c050869e5a0-socket-dir\") pod \"aws-ebs-csi-driver-node-8pqnx\" (UID: \"c13c7653-cb9a-48c0-a38e-8c050869e5a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8pqnx"
Mar 18 16:44:38.751701 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.751610 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/37693aba-cef0-4f5a-a523-bd82dbff0143-hosts-file\") pod \"node-resolver-9fw2f\" (UID: \"37693aba-cef0-4f5a-a523-bd82dbff0143\") " pod="openshift-dns/node-resolver-9fw2f"
Mar 18 16:44:38.751701 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.751632 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-run-systemd\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw"
Mar 18 16:44:38.751701 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.751655 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c68d2b91-1efd-47b8-93dc-98606a96920b-multus-socket-dir-parent\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479"
Mar 18 16:44:38.751701 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.751679 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c68d2b91-1efd-47b8-93dc-98606a96920b-host-run-netns\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479"
Mar 18 16:44:38.751701 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.751703 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c68d2b91-1efd-47b8-93dc-98606a96920b-host-var-lib-cni-multus\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479"
Mar 18 16:44:38.752176 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.751726 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ddvm\" (UniqueName: \"kubernetes.io/projected/37693aba-cef0-4f5a-a523-bd82dbff0143-kube-api-access-7ddvm\") pod \"node-resolver-9fw2f\" (UID: \"37693aba-cef0-4f5a-a523-bd82dbff0143\") " pod="openshift-dns/node-resolver-9fw2f"
Mar 18 16:44:38.752176 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.751748 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b0763006-cf97-41dc-a668-a271612d9705-run\") pod \"tuned-85jj7\" (UID: \"b0763006-cf97-41dc-a668-a271612d9705\") " pod="openshift-cluster-node-tuning-operator/tuned-85jj7"
Mar 18 16:44:38.752176 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.751782 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c68d2b91-1efd-47b8-93dc-98606a96920b-host-var-lib-kubelet\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479"
Mar 18 16:44:38.752176 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.751812 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c13c7653-cb9a-48c0-a38e-8c050869e5a0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8pqnx\" (UID: \"c13c7653-cb9a-48c0-a38e-8c050869e5a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8pqnx"
Mar 18 16:44:38.752176 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.751839 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b0763006-cf97-41dc-a668-a271612d9705-etc-tuned\") pod \"tuned-85jj7\" (UID: \"b0763006-cf97-41dc-a668-a271612d9705\") " pod="openshift-cluster-node-tuning-operator/tuned-85jj7"
Mar 18 16:44:38.752176 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.751862 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-var-lib-openvswitch\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw"
Mar 18 16:44:38.752176 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.751879 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-env-overrides\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw"
Mar 18 16:44:38.752176 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.751894 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c13c7653-cb9a-48c0-a38e-8c050869e5a0-device-dir\") pod \"aws-ebs-csi-driver-node-8pqnx\" (UID: \"c13c7653-cb9a-48c0-a38e-8c050869e5a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8pqnx"
Mar 18 16:44:38.752176 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.751907 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b0763006-cf97-41dc-a668-a271612d9705-etc-sysctl-d\") pod \"tuned-85jj7\" (UID: \"b0763006-cf97-41dc-a668-a271612d9705\") " pod="openshift-cluster-node-tuning-operator/tuned-85jj7"
Mar 18 16:44:38.752176 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.751920 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-run-openvswitch\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw"
Mar 18 16:44:38.752176 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.751945 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw"
Mar 18 16:44:38.752176 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.751969 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-ovn-node-metrics-cert\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw"
Mar 18 16:44:38.752176 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.751985 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b0763006-cf97-41dc-a668-a271612d9705-etc-sysctl-conf\") pod \"tuned-85jj7\" (UID: \"b0763006-cf97-41dc-a668-a271612d9705\") " pod="openshift-cluster-node-tuning-operator/tuned-85jj7"
Mar 18 16:44:38.752176 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.751999 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7p7j\" (UniqueName: \"kubernetes.io/projected/b0763006-cf97-41dc-a668-a271612d9705-kube-api-access-q7p7j\") pod \"tuned-85jj7\" (UID: \"b0763006-cf97-41dc-a668-a271612d9705\") " pod="openshift-cluster-node-tuning-operator/tuned-85jj7"
Mar 18 16:44:38.752176 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.752019 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-run-ovn\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw"
Mar 18 16:44:38.752176 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.752060 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b0763006-cf97-41dc-a668-a271612d9705-etc-systemd\") pod \"tuned-85jj7\" (UID: \"b0763006-cf97-41dc-a668-a271612d9705\") " pod="openshift-cluster-node-tuning-operator/tuned-85jj7"
Mar 18 16:44:38.752842 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.752099 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-host-run-netns\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw"
Mar 18 16:44:38.752842 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.752136 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c68d2b91-1efd-47b8-93dc-98606a96920b-host-run-multus-certs\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479"
Mar 18 16:44:38.752842 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.752206 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b998ab02-161a-40e9-9e53-7183a98152de-kubelet-config\") pod \"global-pull-secret-syncer-f52ql\" (UID: \"b998ab02-161a-40e9-9e53-7183a98152de\") " pod="kube-system/global-pull-secret-syncer-f52ql"
Mar 18 16:44:38.752842 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.752237 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b0763006-cf97-41dc-a668-a271612d9705-var-lib-kubelet\") pod \"tuned-85jj7\" (UID: \"b0763006-cf97-41dc-a668-a271612d9705\") " pod="openshift-cluster-node-tuning-operator/tuned-85jj7"
Mar 18 16:44:38.752842 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.752258 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-node-log\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw"
Mar 18 16:44:38.752842 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.752289 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-host-cni-bin\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw"
Mar 18 16:44:38.752842 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.752335 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c68d2b91-1efd-47b8-93dc-98606a96920b-etc-kubernetes\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479"
Mar 18 16:44:38.752842 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.752375 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c87e1483-58f2-438b-af86-c607ffcbf01c-agent-certs\") pod \"konnectivity-agent-pj5bn\" (UID: \"c87e1483-58f2-438b-af86-c607ffcbf01c\") " pod="kube-system/konnectivity-agent-pj5bn"
Mar 18 16:44:38.752842 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.752401 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-host-slash\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw"
Mar 18 16:44:38.752842 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.752426 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c68d2b91-1efd-47b8-93dc-98606a96920b-host-run-k8s-cni-cncf-io\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479"
Mar 18 16:44:38.752842 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.752461 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c68d2b91-1efd-47b8-93dc-98606a96920b-multus-daemon-config\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479"
Mar 18 16:44:38.752842 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.752485 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c13c7653-cb9a-48c0-a38e-8c050869e5a0-sys-fs\") pod \"aws-ebs-csi-driver-node-8pqnx\" (UID: \"c13c7653-cb9a-48c0-a38e-8c050869e5a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8pqnx"
Mar 18 16:44:38.752842 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.752514 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr9lj\" (UniqueName: \"kubernetes.io/projected/9464f832-e8a5-4469-9d2d-db0f0f547d86-kube-api-access-hr9lj\") pod \"iptables-alerter-5ms7r\" (UID: \"9464f832-e8a5-4469-9d2d-db0f0f547d86\") " pod="openshift-network-operator/iptables-alerter-5ms7r"
Mar 18 16:44:38.752842 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.752545 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b998ab02-161a-40e9-9e53-7183a98152de-original-pull-secret\") pod \"global-pull-secret-syncer-f52ql\" (UID: \"b998ab02-161a-40e9-9e53-7183a98152de\") " pod="kube-system/global-pull-secret-syncer-f52ql"
Mar 18 16:44:38.752842 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.752577 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-ovnkube-script-lib\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw"
Mar 18 16:44:38.752842 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.752605 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c68d2b91-1efd-47b8-93dc-98606a96920b-system-cni-dir\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479"
Mar 18 16:44:38.753308 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.752631 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName:
\"kubernetes.io/host-path/c68d2b91-1efd-47b8-93dc-98606a96920b-multus-cni-dir\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479" Mar 18 16:44:38.753308 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.752656 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c68d2b91-1efd-47b8-93dc-98606a96920b-os-release\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479" Mar 18 16:44:38.753308 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.752678 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b998ab02-161a-40e9-9e53-7183a98152de-dbus\") pod \"global-pull-secret-syncer-f52ql\" (UID: \"b998ab02-161a-40e9-9e53-7183a98152de\") " pod="kube-system/global-pull-secret-syncer-f52ql" Mar 18 16:44:38.753308 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.752702 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b0763006-cf97-41dc-a668-a271612d9705-etc-sysconfig\") pod \"tuned-85jj7\" (UID: \"b0763006-cf97-41dc-a668-a271612d9705\") " pod="openshift-cluster-node-tuning-operator/tuned-85jj7" Mar 18 16:44:38.753308 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.752724 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a421b67e-f253-48e4-b5a3-4a895c3bf6d2-host\") pod \"node-ca-jnvc6\" (UID: \"a421b67e-f253-48e4-b5a3-4a895c3bf6d2\") " pod="openshift-image-registry/node-ca-jnvc6" Mar 18 16:44:38.753308 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.752749 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-host-kubelet\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw" Mar 18 16:44:38.753308 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.752956 2536 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Mar 18 16:44:38.780298 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.780257 2536 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-8mpwq" Mar 18 16:44:38.786340 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.786321 2536 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-8mpwq" Mar 18 16:44:38.846599 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.846576 2536 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Mar 18 16:44:38.853085 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.853030 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c68d2b91-1efd-47b8-93dc-98606a96920b-host-run-k8s-cni-cncf-io\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479" Mar 18 16:44:38.853085 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.853074 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c68d2b91-1efd-47b8-93dc-98606a96920b-multus-daemon-config\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479" Mar 18 16:44:38.853238 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.853107 2536 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c13c7653-cb9a-48c0-a38e-8c050869e5a0-sys-fs\") pod \"aws-ebs-csi-driver-node-8pqnx\" (UID: \"c13c7653-cb9a-48c0-a38e-8c050869e5a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8pqnx" Mar 18 16:44:38.853238 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.853132 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hr9lj\" (UniqueName: \"kubernetes.io/projected/9464f832-e8a5-4469-9d2d-db0f0f547d86-kube-api-access-hr9lj\") pod \"iptables-alerter-5ms7r\" (UID: \"9464f832-e8a5-4469-9d2d-db0f0f547d86\") " pod="openshift-network-operator/iptables-alerter-5ms7r" Mar 18 16:44:38.853238 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.853136 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c68d2b91-1efd-47b8-93dc-98606a96920b-host-run-k8s-cni-cncf-io\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479" Mar 18 16:44:38.853238 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.853157 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b998ab02-161a-40e9-9e53-7183a98152de-original-pull-secret\") pod \"global-pull-secret-syncer-f52ql\" (UID: \"b998ab02-161a-40e9-9e53-7183a98152de\") " pod="kube-system/global-pull-secret-syncer-f52ql" Mar 18 16:44:38.853238 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.853202 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-ovnkube-script-lib\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw" Mar 18 
16:44:38.853476 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.853246 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c68d2b91-1efd-47b8-93dc-98606a96920b-system-cni-dir\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479" Mar 18 16:44:38.853476 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.853287 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c13c7653-cb9a-48c0-a38e-8c050869e5a0-sys-fs\") pod \"aws-ebs-csi-driver-node-8pqnx\" (UID: \"c13c7653-cb9a-48c0-a38e-8c050869e5a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8pqnx" Mar 18 16:44:38.853476 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.853290 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c68d2b91-1efd-47b8-93dc-98606a96920b-multus-cni-dir\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479" Mar 18 16:44:38.853476 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:38.853311 2536 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Mar 18 16:44:38.853476 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.853329 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c68d2b91-1efd-47b8-93dc-98606a96920b-os-release\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479" Mar 18 16:44:38.853476 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.853348 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/c68d2b91-1efd-47b8-93dc-98606a96920b-multus-cni-dir\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479" Mar 18 16:44:38.853476 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.853360 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/96282abf-ce09-4b33-baaf-73f9c5329541-cnibin\") pod \"multus-additional-cni-plugins-d7xhk\" (UID: \"96282abf-ce09-4b33-baaf-73f9c5329541\") " pod="openshift-multus/multus-additional-cni-plugins-d7xhk" Mar 18 16:44:38.853476 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.853417 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c68d2b91-1efd-47b8-93dc-98606a96920b-os-release\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479" Mar 18 16:44:38.853476 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.853457 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c68d2b91-1efd-47b8-93dc-98606a96920b-system-cni-dir\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479" Mar 18 16:44:38.853845 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:38.853573 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b998ab02-161a-40e9-9e53-7183a98152de-original-pull-secret podName:b998ab02-161a-40e9-9e53-7183a98152de nodeName:}" failed. No retries permitted until 2026-03-18 16:44:39.353369221 +0000 UTC m=+2.126943790 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b998ab02-161a-40e9-9e53-7183a98152de-original-pull-secret") pod "global-pull-secret-syncer-f52ql" (UID: "b998ab02-161a-40e9-9e53-7183a98152de") : object "kube-system"/"original-pull-secret" not registered Mar 18 16:44:38.853845 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.853632 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b998ab02-161a-40e9-9e53-7183a98152de-dbus\") pod \"global-pull-secret-syncer-f52ql\" (UID: \"b998ab02-161a-40e9-9e53-7183a98152de\") " pod="kube-system/global-pull-secret-syncer-f52ql" Mar 18 16:44:38.853845 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.853639 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c68d2b91-1efd-47b8-93dc-98606a96920b-multus-daemon-config\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479" Mar 18 16:44:38.853845 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.853661 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b0763006-cf97-41dc-a668-a271612d9705-etc-sysconfig\") pod \"tuned-85jj7\" (UID: \"b0763006-cf97-41dc-a668-a271612d9705\") " pod="openshift-cluster-node-tuning-operator/tuned-85jj7" Mar 18 16:44:38.853845 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.853689 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a421b67e-f253-48e4-b5a3-4a895c3bf6d2-host\") pod \"node-ca-jnvc6\" (UID: \"a421b67e-f253-48e4-b5a3-4a895c3bf6d2\") " pod="openshift-image-registry/node-ca-jnvc6" Mar 18 16:44:38.853845 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.853705 2536 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-host-kubelet\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw" Mar 18 16:44:38.853845 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.853720 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-host-run-ovn-kubernetes\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw" Mar 18 16:44:38.853845 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.853738 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8zp9d\" (UniqueName: \"kubernetes.io/projected/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-kube-api-access-8zp9d\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw" Mar 18 16:44:38.853845 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.853783 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b998ab02-161a-40e9-9e53-7183a98152de-dbus\") pod \"global-pull-secret-syncer-f52ql\" (UID: \"b998ab02-161a-40e9-9e53-7183a98152de\") " pod="kube-system/global-pull-secret-syncer-f52ql" Mar 18 16:44:38.853845 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.853784 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-ovnkube-script-lib\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw" Mar 18 16:44:38.854145 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.853943 
2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-host-kubelet\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw" Mar 18 16:44:38.854145 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.854000 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b0763006-cf97-41dc-a668-a271612d9705-etc-sysconfig\") pod \"tuned-85jj7\" (UID: \"b0763006-cf97-41dc-a668-a271612d9705\") " pod="openshift-cluster-node-tuning-operator/tuned-85jj7" Mar 18 16:44:38.854145 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.854029 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/96282abf-ce09-4b33-baaf-73f9c5329541-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-d7xhk\" (UID: \"96282abf-ce09-4b33-baaf-73f9c5329541\") " pod="openshift-multus/multus-additional-cni-plugins-d7xhk" Mar 18 16:44:38.854145 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.854050 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c13c7653-cb9a-48c0-a38e-8c050869e5a0-registration-dir\") pod \"aws-ebs-csi-driver-node-8pqnx\" (UID: \"c13c7653-cb9a-48c0-a38e-8c050869e5a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8pqnx" Mar 18 16:44:38.854145 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.854087 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mcnnk\" (UniqueName: \"kubernetes.io/projected/c13c7653-cb9a-48c0-a38e-8c050869e5a0-kube-api-access-mcnnk\") pod \"aws-ebs-csi-driver-node-8pqnx\" (UID: \"c13c7653-cb9a-48c0-a38e-8c050869e5a0\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8pqnx" Mar 18 16:44:38.854443 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.854148 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-host-run-ovn-kubernetes\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw" Mar 18 16:44:38.854443 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.854165 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b0763006-cf97-41dc-a668-a271612d9705-etc-kubernetes\") pod \"tuned-85jj7\" (UID: \"b0763006-cf97-41dc-a668-a271612d9705\") " pod="openshift-cluster-node-tuning-operator/tuned-85jj7" Mar 18 16:44:38.854443 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.854171 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c13c7653-cb9a-48c0-a38e-8c050869e5a0-registration-dir\") pod \"aws-ebs-csi-driver-node-8pqnx\" (UID: \"c13c7653-cb9a-48c0-a38e-8c050869e5a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8pqnx" Mar 18 16:44:38.854443 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.854195 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a421b67e-f253-48e4-b5a3-4a895c3bf6d2-host\") pod \"node-ca-jnvc6\" (UID: \"a421b67e-f253-48e4-b5a3-4a895c3bf6d2\") " pod="openshift-image-registry/node-ca-jnvc6" Mar 18 16:44:38.854443 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.854196 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b0763006-cf97-41dc-a668-a271612d9705-tmp\") pod \"tuned-85jj7\" (UID: \"b0763006-cf97-41dc-a668-a271612d9705\") " 
pod="openshift-cluster-node-tuning-operator/tuned-85jj7" Mar 18 16:44:38.854443 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.854235 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c68d2b91-1efd-47b8-93dc-98606a96920b-hostroot\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479" Mar 18 16:44:38.854443 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.854253 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b0763006-cf97-41dc-a668-a271612d9705-etc-kubernetes\") pod \"tuned-85jj7\" (UID: \"b0763006-cf97-41dc-a668-a271612d9705\") " pod="openshift-cluster-node-tuning-operator/tuned-85jj7" Mar 18 16:44:38.854443 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.854260 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c13c7653-cb9a-48c0-a38e-8c050869e5a0-etc-selinux\") pod \"aws-ebs-csi-driver-node-8pqnx\" (UID: \"c13c7653-cb9a-48c0-a38e-8c050869e5a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8pqnx" Mar 18 16:44:38.854443 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.854257 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c68d2b91-1efd-47b8-93dc-98606a96920b-hostroot\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479" Mar 18 16:44:38.854443 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.854312 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9464f832-e8a5-4469-9d2d-db0f0f547d86-host-slash\") pod \"iptables-alerter-5ms7r\" (UID: \"9464f832-e8a5-4469-9d2d-db0f0f547d86\") " 
pod="openshift-network-operator/iptables-alerter-5ms7r" Mar 18 16:44:38.854443 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.854342 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b0763006-cf97-41dc-a668-a271612d9705-etc-modprobe-d\") pod \"tuned-85jj7\" (UID: \"b0763006-cf97-41dc-a668-a271612d9705\") " pod="openshift-cluster-node-tuning-operator/tuned-85jj7" Mar 18 16:44:38.854443 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.854330 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c13c7653-cb9a-48c0-a38e-8c050869e5a0-etc-selinux\") pod \"aws-ebs-csi-driver-node-8pqnx\" (UID: \"c13c7653-cb9a-48c0-a38e-8c050869e5a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8pqnx" Mar 18 16:44:38.854443 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.854378 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9464f832-e8a5-4469-9d2d-db0f0f547d86-host-slash\") pod \"iptables-alerter-5ms7r\" (UID: \"9464f832-e8a5-4469-9d2d-db0f0f547d86\") " pod="openshift-network-operator/iptables-alerter-5ms7r" Mar 18 16:44:38.854443 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.854423 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c68d2b91-1efd-47b8-93dc-98606a96920b-host-var-lib-cni-bin\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479" Mar 18 16:44:38.855070 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.854459 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9464f832-e8a5-4469-9d2d-db0f0f547d86-iptables-alerter-script\") pod \"iptables-alerter-5ms7r\" (UID: 
\"9464f832-e8a5-4469-9d2d-db0f0f547d86\") " pod="openshift-network-operator/iptables-alerter-5ms7r" Mar 18 16:44:38.855070 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.854464 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b0763006-cf97-41dc-a668-a271612d9705-etc-modprobe-d\") pod \"tuned-85jj7\" (UID: \"b0763006-cf97-41dc-a668-a271612d9705\") " pod="openshift-cluster-node-tuning-operator/tuned-85jj7" Mar 18 16:44:38.855070 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.854493 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-etc-openvswitch\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw" Mar 18 16:44:38.855070 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.854519 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-log-socket\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw" Mar 18 16:44:38.855070 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.854498 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c68d2b91-1efd-47b8-93dc-98606a96920b-host-var-lib-cni-bin\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479" Mar 18 16:44:38.855070 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.854526 2536 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 18 16:44:38.855070 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.854524 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-etc-openvswitch\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw" Mar 18 16:44:38.855070 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.854541 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c68d2b91-1efd-47b8-93dc-98606a96920b-multus-conf-dir\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479" Mar 18 16:44:38.855070 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.854566 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c68d2b91-1efd-47b8-93dc-98606a96920b-multus-conf-dir\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479" Mar 18 16:44:38.855070 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.854569 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c87e1483-58f2-438b-af86-c607ffcbf01c-konnectivity-ca\") pod \"konnectivity-agent-pj5bn\" (UID: \"c87e1483-58f2-438b-af86-c607ffcbf01c\") " pod="kube-system/konnectivity-agent-pj5bn" Mar 18 16:44:38.855070 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.854610 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-log-socket\") pod \"ovnkube-node-ts4mw\" (UID: 
\"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw"
Mar 18 16:44:38.855070 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.854632 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b0763006-cf97-41dc-a668-a271612d9705-sys\") pod \"tuned-85jj7\" (UID: \"b0763006-cf97-41dc-a668-a271612d9705\") " pod="openshift-cluster-node-tuning-operator/tuned-85jj7"
Mar 18 16:44:38.855070 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.854658 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b0763006-cf97-41dc-a668-a271612d9705-host\") pod \"tuned-85jj7\" (UID: \"b0763006-cf97-41dc-a668-a271612d9705\") " pod="openshift-cluster-node-tuning-operator/tuned-85jj7"
Mar 18 16:44:38.855070 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.854682 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-systemd-units\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw"
Mar 18 16:44:38.855070 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.854705 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c68d2b91-1efd-47b8-93dc-98606a96920b-cnibin\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479"
Mar 18 16:44:38.855070 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.854716 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b0763006-cf97-41dc-a668-a271612d9705-sys\") pod \"tuned-85jj7\" (UID: \"b0763006-cf97-41dc-a668-a271612d9705\") " pod="openshift-cluster-node-tuning-operator/tuned-85jj7"
Mar 18 16:44:38.855070 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.854729 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c68d2b91-1efd-47b8-93dc-98606a96920b-cni-binary-copy\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479"
Mar 18 16:44:38.855070 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.854754 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-systemd-units\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw"
Mar 18 16:44:38.855983 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.854736 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b0763006-cf97-41dc-a668-a271612d9705-host\") pod \"tuned-85jj7\" (UID: \"b0763006-cf97-41dc-a668-a271612d9705\") " pod="openshift-cluster-node-tuning-operator/tuned-85jj7"
Mar 18 16:44:38.855983 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.854761 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/96282abf-ce09-4b33-baaf-73f9c5329541-system-cni-dir\") pod \"multus-additional-cni-plugins-d7xhk\" (UID: \"96282abf-ce09-4b33-baaf-73f9c5329541\") " pod="openshift-multus/multus-additional-cni-plugins-d7xhk"
Mar 18 16:44:38.855983 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.854799 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/37693aba-cef0-4f5a-a523-bd82dbff0143-tmp-dir\") pod \"node-resolver-9fw2f\" (UID: \"37693aba-cef0-4f5a-a523-bd82dbff0143\") " pod="openshift-dns/node-resolver-9fw2f"
Mar 18 16:44:38.855983 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.854802 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c68d2b91-1efd-47b8-93dc-98606a96920b-cnibin\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479"
Mar 18 16:44:38.855983 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.854823 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/275a2fa6-277f-40dc-a2bc-749a97550e2e-metrics-certs\") pod \"network-metrics-daemon-rjx6m\" (UID: \"275a2fa6-277f-40dc-a2bc-749a97550e2e\") " pod="openshift-multus/network-metrics-daemon-rjx6m"
Mar 18 16:44:38.855983 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.854847 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q9n2p\" (UniqueName: \"kubernetes.io/projected/275a2fa6-277f-40dc-a2bc-749a97550e2e-kube-api-access-q9n2p\") pod \"network-metrics-daemon-rjx6m\" (UID: \"275a2fa6-277f-40dc-a2bc-749a97550e2e\") " pod="openshift-multus/network-metrics-daemon-rjx6m"
Mar 18 16:44:38.855983 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.854872 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a421b67e-f253-48e4-b5a3-4a895c3bf6d2-serviceca\") pod \"node-ca-jnvc6\" (UID: \"a421b67e-f253-48e4-b5a3-4a895c3bf6d2\") " pod="openshift-image-registry/node-ca-jnvc6"
Mar 18 16:44:38.855983 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.854896 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cqdgg\" (UniqueName: \"kubernetes.io/projected/a421b67e-f253-48e4-b5a3-4a895c3bf6d2-kube-api-access-cqdgg\") pod \"node-ca-jnvc6\" (UID: \"a421b67e-f253-48e4-b5a3-4a895c3bf6d2\") " pod="openshift-image-registry/node-ca-jnvc6"
Mar 18 16:44:38.855983 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.854918 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-ovnkube-config\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw"
Mar 18 16:44:38.855983 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.854943 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4cc8l\" (UniqueName: \"kubernetes.io/projected/c68d2b91-1efd-47b8-93dc-98606a96920b-kube-api-access-4cc8l\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479"
Mar 18 16:44:38.855983 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.854971 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/96282abf-ce09-4b33-baaf-73f9c5329541-tuning-conf-dir\") pod \"multus-additional-cni-plugins-d7xhk\" (UID: \"96282abf-ce09-4b33-baaf-73f9c5329541\") " pod="openshift-multus/multus-additional-cni-plugins-d7xhk"
Mar 18 16:44:38.855983 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.854976 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9464f832-e8a5-4469-9d2d-db0f0f547d86-iptables-alerter-script\") pod \"iptables-alerter-5ms7r\" (UID: \"9464f832-e8a5-4469-9d2d-db0f0f547d86\") " pod="openshift-network-operator/iptables-alerter-5ms7r"
Mar 18 16:44:38.855983 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.855002 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b0763006-cf97-41dc-a668-a271612d9705-lib-modules\") pod \"tuned-85jj7\" (UID: \"b0763006-cf97-41dc-a668-a271612d9705\") " pod="openshift-cluster-node-tuning-operator/tuned-85jj7"
Mar 18 16:44:38.855983 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.855029 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-host-cni-netd\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw"
Mar 18 16:44:38.855983 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.855058 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c13c7653-cb9a-48c0-a38e-8c050869e5a0-socket-dir\") pod \"aws-ebs-csi-driver-node-8pqnx\" (UID: \"c13c7653-cb9a-48c0-a38e-8c050869e5a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8pqnx"
Mar 18 16:44:38.855983 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.855059 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c87e1483-58f2-438b-af86-c607ffcbf01c-konnectivity-ca\") pod \"konnectivity-agent-pj5bn\" (UID: \"c87e1483-58f2-438b-af86-c607ffcbf01c\") " pod="kube-system/konnectivity-agent-pj5bn"
Mar 18 16:44:38.855983 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.855104 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/37693aba-cef0-4f5a-a523-bd82dbff0143-hosts-file\") pod \"node-resolver-9fw2f\" (UID: \"37693aba-cef0-4f5a-a523-bd82dbff0143\") " pod="openshift-dns/node-resolver-9fw2f"
Mar 18 16:44:38.856899 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.855128 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-run-systemd\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw"
Mar 18 16:44:38.856899 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:38.855157 2536 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:44:38.856899 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.855162 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c68d2b91-1efd-47b8-93dc-98606a96920b-multus-socket-dir-parent\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479"
Mar 18 16:44:38.856899 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.855187 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/37693aba-cef0-4f5a-a523-bd82dbff0143-tmp-dir\") pod \"node-resolver-9fw2f\" (UID: \"37693aba-cef0-4f5a-a523-bd82dbff0143\") " pod="openshift-dns/node-resolver-9fw2f"
Mar 18 16:44:38.856899 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:38.855213 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/275a2fa6-277f-40dc-a2bc-749a97550e2e-metrics-certs podName:275a2fa6-277f-40dc-a2bc-749a97550e2e nodeName:}" failed. No retries permitted until 2026-03-18 16:44:39.355196314 +0000 UTC m=+2.128770885 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/275a2fa6-277f-40dc-a2bc-749a97550e2e-metrics-certs") pod "network-metrics-daemon-rjx6m" (UID: "275a2fa6-277f-40dc-a2bc-749a97550e2e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:44:38.856899 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.855230 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c68d2b91-1efd-47b8-93dc-98606a96920b-multus-socket-dir-parent\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479"
Mar 18 16:44:38.856899 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.855249 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c68d2b91-1efd-47b8-93dc-98606a96920b-cni-binary-copy\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479"
Mar 18 16:44:38.856899 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.855285 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-host-cni-netd\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw"
Mar 18 16:44:38.856899 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.855351 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/37693aba-cef0-4f5a-a523-bd82dbff0143-hosts-file\") pod \"node-resolver-9fw2f\" (UID: \"37693aba-cef0-4f5a-a523-bd82dbff0143\") " pod="openshift-dns/node-resolver-9fw2f"
Mar 18 16:44:38.856899 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.855386 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b0763006-cf97-41dc-a668-a271612d9705-lib-modules\") pod \"tuned-85jj7\" (UID: \"b0763006-cf97-41dc-a668-a271612d9705\") " pod="openshift-cluster-node-tuning-operator/tuned-85jj7"
Mar 18 16:44:38.856899 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.855400 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-run-systemd\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw"
Mar 18 16:44:38.856899 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.855430 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c68d2b91-1efd-47b8-93dc-98606a96920b-host-run-netns\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479"
Mar 18 16:44:38.856899 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.855457 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c68d2b91-1efd-47b8-93dc-98606a96920b-host-var-lib-cni-multus\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479"
Mar 18 16:44:38.856899 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.855481 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c13c7653-cb9a-48c0-a38e-8c050869e5a0-socket-dir\") pod \"aws-ebs-csi-driver-node-8pqnx\" (UID: \"c13c7653-cb9a-48c0-a38e-8c050869e5a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8pqnx"
Mar 18 16:44:38.856899 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.855485 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7ddvm\" (UniqueName: \"kubernetes.io/projected/37693aba-cef0-4f5a-a523-bd82dbff0143-kube-api-access-7ddvm\") pod \"node-resolver-9fw2f\" (UID: \"37693aba-cef0-4f5a-a523-bd82dbff0143\") " pod="openshift-dns/node-resolver-9fw2f"
Mar 18 16:44:38.856899 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.855517 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b0763006-cf97-41dc-a668-a271612d9705-run\") pod \"tuned-85jj7\" (UID: \"b0763006-cf97-41dc-a668-a271612d9705\") " pod="openshift-cluster-node-tuning-operator/tuned-85jj7"
Mar 18 16:44:38.856899 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.855537 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c68d2b91-1efd-47b8-93dc-98606a96920b-host-run-netns\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479"
Mar 18 16:44:38.856899 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.855542 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c68d2b91-1efd-47b8-93dc-98606a96920b-host-var-lib-kubelet\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479"
Mar 18 16:44:38.857639 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.855558 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a421b67e-f253-48e4-b5a3-4a895c3bf6d2-serviceca\") pod \"node-ca-jnvc6\" (UID: \"a421b67e-f253-48e4-b5a3-4a895c3bf6d2\") " pod="openshift-image-registry/node-ca-jnvc6"
Mar 18 16:44:38.857639 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.855572 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c68d2b91-1efd-47b8-93dc-98606a96920b-host-var-lib-kubelet\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479"
Mar 18 16:44:38.857639 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.855588 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/96282abf-ce09-4b33-baaf-73f9c5329541-cni-binary-copy\") pod \"multus-additional-cni-plugins-d7xhk\" (UID: \"96282abf-ce09-4b33-baaf-73f9c5329541\") " pod="openshift-multus/multus-additional-cni-plugins-d7xhk"
Mar 18 16:44:38.857639 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.855608 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b0763006-cf97-41dc-a668-a271612d9705-run\") pod \"tuned-85jj7\" (UID: \"b0763006-cf97-41dc-a668-a271612d9705\") " pod="openshift-cluster-node-tuning-operator/tuned-85jj7"
Mar 18 16:44:38.857639 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.855619 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c13c7653-cb9a-48c0-a38e-8c050869e5a0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8pqnx\" (UID: \"c13c7653-cb9a-48c0-a38e-8c050869e5a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8pqnx"
Mar 18 16:44:38.857639 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.855648 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c68d2b91-1efd-47b8-93dc-98606a96920b-host-var-lib-cni-multus\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479"
Mar 18 16:44:38.857639 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.855655 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b0763006-cf97-41dc-a668-a271612d9705-etc-tuned\") pod \"tuned-85jj7\" (UID: \"b0763006-cf97-41dc-a668-a271612d9705\") " pod="openshift-cluster-node-tuning-operator/tuned-85jj7"
Mar 18 16:44:38.857639 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.855694 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c13c7653-cb9a-48c0-a38e-8c050869e5a0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8pqnx\" (UID: \"c13c7653-cb9a-48c0-a38e-8c050869e5a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8pqnx"
Mar 18 16:44:38.857639 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.855684 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-var-lib-openvswitch\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw"
Mar 18 16:44:38.857639 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.855717 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-ovnkube-config\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw"
Mar 18 16:44:38.857639 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.855738 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-env-overrides\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw"
Mar 18 16:44:38.857639 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.855776 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-var-lib-openvswitch\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw"
Mar 18 16:44:38.857639 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.855769 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c13c7653-cb9a-48c0-a38e-8c050869e5a0-device-dir\") pod \"aws-ebs-csi-driver-node-8pqnx\" (UID: \"c13c7653-cb9a-48c0-a38e-8c050869e5a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8pqnx"
Mar 18 16:44:38.857639 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.855831 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b0763006-cf97-41dc-a668-a271612d9705-etc-sysctl-d\") pod \"tuned-85jj7\" (UID: \"b0763006-cf97-41dc-a668-a271612d9705\") " pod="openshift-cluster-node-tuning-operator/tuned-85jj7"
Mar 18 16:44:38.857639 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.855867 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-run-openvswitch\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw"
Mar 18 16:44:38.857639 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.855892 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw"
Mar 18 16:44:38.857639 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.855910 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c13c7653-cb9a-48c0-a38e-8c050869e5a0-device-dir\") pod \"aws-ebs-csi-driver-node-8pqnx\" (UID: \"c13c7653-cb9a-48c0-a38e-8c050869e5a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8pqnx"
Mar 18 16:44:38.858391 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.855926 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-ovn-node-metrics-cert\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw"
Mar 18 16:44:38.858391 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.855964 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-run-openvswitch\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw"
Mar 18 16:44:38.858391 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.856026 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/96282abf-ce09-4b33-baaf-73f9c5329541-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-d7xhk\" (UID: \"96282abf-ce09-4b33-baaf-73f9c5329541\") " pod="openshift-multus/multus-additional-cni-plugins-d7xhk"
Mar 18 16:44:38.858391 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.856057 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b0763006-cf97-41dc-a668-a271612d9705-etc-sysctl-conf\") pod \"tuned-85jj7\" (UID: \"b0763006-cf97-41dc-a668-a271612d9705\") " pod="openshift-cluster-node-tuning-operator/tuned-85jj7"
Mar 18 16:44:38.858391 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.856091 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q7p7j\" (UniqueName: \"kubernetes.io/projected/b0763006-cf97-41dc-a668-a271612d9705-kube-api-access-q7p7j\") pod \"tuned-85jj7\" (UID: \"b0763006-cf97-41dc-a668-a271612d9705\") " pod="openshift-cluster-node-tuning-operator/tuned-85jj7"
Mar 18 16:44:38.858391 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.856107 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw"
Mar 18 16:44:38.858391 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.856116 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-run-ovn\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw"
Mar 18 16:44:38.858391 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.856149 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/96282abf-ce09-4b33-baaf-73f9c5329541-os-release\") pod \"multus-additional-cni-plugins-d7xhk\" (UID: \"96282abf-ce09-4b33-baaf-73f9c5329541\") " pod="openshift-multus/multus-additional-cni-plugins-d7xhk"
Mar 18 16:44:38.858391 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.856149 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-run-ovn\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw"
Mar 18 16:44:38.858391 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.856175 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gfw8\" (UniqueName: \"kubernetes.io/projected/96282abf-ce09-4b33-baaf-73f9c5329541-kube-api-access-5gfw8\") pod \"multus-additional-cni-plugins-d7xhk\" (UID: \"96282abf-ce09-4b33-baaf-73f9c5329541\") " pod="openshift-multus/multus-additional-cni-plugins-d7xhk"
Mar 18 16:44:38.858391 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.856202 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b0763006-cf97-41dc-a668-a271612d9705-etc-systemd\") pod \"tuned-85jj7\" (UID: \"b0763006-cf97-41dc-a668-a271612d9705\") " pod="openshift-cluster-node-tuning-operator/tuned-85jj7"
Mar 18 16:44:38.858391 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.856226 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-host-run-netns\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw"
Mar 18 16:44:38.858391 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.856241 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b0763006-cf97-41dc-a668-a271612d9705-etc-sysctl-d\") pod \"tuned-85jj7\" (UID: \"b0763006-cf97-41dc-a668-a271612d9705\") " pod="openshift-cluster-node-tuning-operator/tuned-85jj7"
Mar 18 16:44:38.858391 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.856252 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c68d2b91-1efd-47b8-93dc-98606a96920b-host-run-multus-certs\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479"
Mar 18 16:44:38.858391 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.856295 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b998ab02-161a-40e9-9e53-7183a98152de-kubelet-config\") pod \"global-pull-secret-syncer-f52ql\" (UID: \"b998ab02-161a-40e9-9e53-7183a98152de\") " pod="kube-system/global-pull-secret-syncer-f52ql"
Mar 18 16:44:38.858391 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.856321 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b0763006-cf97-41dc-a668-a271612d9705-var-lib-kubelet\") pod \"tuned-85jj7\" (UID: \"b0763006-cf97-41dc-a668-a271612d9705\") " pod="openshift-cluster-node-tuning-operator/tuned-85jj7"
Mar 18 16:44:38.858391 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.856344 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-node-log\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw"
Mar 18 16:44:38.859214 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.856352 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-host-run-netns\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw"
Mar 18 16:44:38.859214 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.856357 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b0763006-cf97-41dc-a668-a271612d9705-etc-sysctl-conf\") pod \"tuned-85jj7\" (UID: \"b0763006-cf97-41dc-a668-a271612d9705\") " pod="openshift-cluster-node-tuning-operator/tuned-85jj7"
Mar 18 16:44:38.859214 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.856367 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-host-cni-bin\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw"
Mar 18 16:44:38.859214 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.856370 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-env-overrides\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw"
Mar 18 16:44:38.859214 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.856394 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c68d2b91-1efd-47b8-93dc-98606a96920b-etc-kubernetes\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479"
Mar 18 16:44:38.859214 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.856408 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b0763006-cf97-41dc-a668-a271612d9705-etc-systemd\") pod \"tuned-85jj7\" (UID: \"b0763006-cf97-41dc-a668-a271612d9705\") " pod="openshift-cluster-node-tuning-operator/tuned-85jj7"
Mar 18 16:44:38.859214 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.856412 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c68d2b91-1efd-47b8-93dc-98606a96920b-host-run-multus-certs\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479"
Mar 18 16:44:38.859214 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.856421 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fldmr\" (UniqueName: \"kubernetes.io/projected/440da786-0ff1-4727-bd36-e32a3acc5a3c-kube-api-access-fldmr\") pod \"network-check-target-cr96r\" (UID: \"440da786-0ff1-4727-bd36-e32a3acc5a3c\") " pod="openshift-network-diagnostics/network-check-target-cr96r"
Mar 18 16:44:38.859214 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.856449 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c87e1483-58f2-438b-af86-c607ffcbf01c-agent-certs\") pod \"konnectivity-agent-pj5bn\" (UID: \"c87e1483-58f2-438b-af86-c607ffcbf01c\") " pod="kube-system/konnectivity-agent-pj5bn"
Mar 18 16:44:38.859214 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.856453 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-node-log\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw"
Mar 18 16:44:38.859214 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.856461 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b998ab02-161a-40e9-9e53-7183a98152de-kubelet-config\") pod \"global-pull-secret-syncer-f52ql\" (UID: \"b998ab02-161a-40e9-9e53-7183a98152de\") " pod="kube-system/global-pull-secret-syncer-f52ql"
Mar 18 16:44:38.859214 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.856466 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c68d2b91-1efd-47b8-93dc-98606a96920b-etc-kubernetes\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479"
Mar 18 16:44:38.859214 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.856455 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b0763006-cf97-41dc-a668-a271612d9705-var-lib-kubelet\") pod \"tuned-85jj7\" (UID: \"b0763006-cf97-41dc-a668-a271612d9705\") " pod="openshift-cluster-node-tuning-operator/tuned-85jj7"
Mar 18 16:44:38.859214 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.856474 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-host-slash\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw"
Mar 18 16:44:38.859214 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.856495 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-host-cni-bin\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw"
Mar 18 16:44:38.859214 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.856511 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-host-slash\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw"
Mar 18 16:44:38.859214 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.858028 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b0763006-cf97-41dc-a668-a271612d9705-tmp\") pod \"tuned-85jj7\" (UID: \"b0763006-cf97-41dc-a668-a271612d9705\") " pod="openshift-cluster-node-tuning-operator/tuned-85jj7"
Mar 18 16:44:38.859214 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.858079 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b0763006-cf97-41dc-a668-a271612d9705-etc-tuned\") pod \"tuned-85jj7\" (UID: \"b0763006-cf97-41dc-a668-a271612d9705\") " pod="openshift-cluster-node-tuning-operator/tuned-85jj7"
Mar 18 16:44:38.859800 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.858617 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-ovn-node-metrics-cert\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw"
Mar 18 16:44:38.859800 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.859022 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c87e1483-58f2-438b-af86-c607ffcbf01c-agent-certs\") pod \"konnectivity-agent-pj5bn\" (UID: \"c87e1483-58f2-438b-af86-c607ffcbf01c\") " pod="kube-system/konnectivity-agent-pj5bn"
Mar 18 16:44:38.861820 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.861794 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr9lj\" (UniqueName: \"kubernetes.io/projected/9464f832-e8a5-4469-9d2d-db0f0f547d86-kube-api-access-hr9lj\") pod \"iptables-alerter-5ms7r\" (UID: \"9464f832-e8a5-4469-9d2d-db0f0f547d86\") " pod="openshift-network-operator/iptables-alerter-5ms7r"
Mar 18 16:44:38.863641 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.863584 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ddvm\" (UniqueName:
\"kubernetes.io/projected/37693aba-cef0-4f5a-a523-bd82dbff0143-kube-api-access-7ddvm\") pod \"node-resolver-9fw2f\" (UID: \"37693aba-cef0-4f5a-a523-bd82dbff0143\") " pod="openshift-dns/node-resolver-9fw2f" Mar 18 16:44:38.864195 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.863864 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcnnk\" (UniqueName: \"kubernetes.io/projected/c13c7653-cb9a-48c0-a38e-8c050869e5a0-kube-api-access-mcnnk\") pod \"aws-ebs-csi-driver-node-8pqnx\" (UID: \"c13c7653-cb9a-48c0-a38e-8c050869e5a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8pqnx" Mar 18 16:44:38.864195 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.864121 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9n2p\" (UniqueName: \"kubernetes.io/projected/275a2fa6-277f-40dc-a2bc-749a97550e2e-kube-api-access-q9n2p\") pod \"network-metrics-daemon-rjx6m\" (UID: \"275a2fa6-277f-40dc-a2bc-749a97550e2e\") " pod="openshift-multus/network-metrics-daemon-rjx6m" Mar 18 16:44:38.864195 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.864157 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cc8l\" (UniqueName: \"kubernetes.io/projected/c68d2b91-1efd-47b8-93dc-98606a96920b-kube-api-access-4cc8l\") pod \"multus-74479\" (UID: \"c68d2b91-1efd-47b8-93dc-98606a96920b\") " pod="openshift-multus/multus-74479" Mar 18 16:44:38.864679 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.864661 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqdgg\" (UniqueName: \"kubernetes.io/projected/a421b67e-f253-48e4-b5a3-4a895c3bf6d2-kube-api-access-cqdgg\") pod \"node-ca-jnvc6\" (UID: \"a421b67e-f253-48e4-b5a3-4a895c3bf6d2\") " pod="openshift-image-registry/node-ca-jnvc6" Mar 18 16:44:38.864816 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.864797 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-8zp9d\" (UniqueName: \"kubernetes.io/projected/3e6fe5a1-3b13-4d51-a141-1280e1b25b3a-kube-api-access-8zp9d\") pod \"ovnkube-node-ts4mw\" (UID: \"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw" Mar 18 16:44:38.864857 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.864802 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7p7j\" (UniqueName: \"kubernetes.io/projected/b0763006-cf97-41dc-a668-a271612d9705-kube-api-access-q7p7j\") pod \"tuned-85jj7\" (UID: \"b0763006-cf97-41dc-a668-a271612d9705\") " pod="openshift-cluster-node-tuning-operator/tuned-85jj7" Mar 18 16:44:38.877865 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.877847 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9fw2f" Mar 18 16:44:38.883003 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:38.882969 2536 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a6f0eddc0950f127c5c3c1035cc71a3.slice/crio-3defc787ba4dfb988d3c92be1ce339885f30f668695c6964f9bf5b183f3f7882 WatchSource:0}: Error finding container 3defc787ba4dfb988d3c92be1ce339885f30f668695c6964f9bf5b183f3f7882: Status 404 returned error can't find the container with id 3defc787ba4dfb988d3c92be1ce339885f30f668695c6964f9bf5b183f3f7882 Mar 18 16:44:38.883961 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:38.883943 2536 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fe312c662a48946cb6ecc07ea171293.slice/crio-b3066fec1e246079f327610b54bed5df30d073dd9b10d9025c2e55ab21d8a334 WatchSource:0}: Error finding container b3066fec1e246079f327610b54bed5df30d073dd9b10d9025c2e55ab21d8a334: Status 404 returned error can't find the container with id b3066fec1e246079f327610b54bed5df30d073dd9b10d9025c2e55ab21d8a334 Mar 18 16:44:38.890036 
ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.890020 2536 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 16:44:38.908380 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.908336 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9fw2f" event={"ID":"37693aba-cef0-4f5a-a523-bd82dbff0143","Type":"ContainerStarted","Data":"41ea9da7b97ee6f7c928443db558881346592a60a72b5f171e0229aa5c4afc0d"} Mar 18 16:44:38.909163 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.909135 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-224.ec2.internal" event={"ID":"7a6f0eddc0950f127c5c3c1035cc71a3","Type":"ContainerStarted","Data":"3defc787ba4dfb988d3c92be1ce339885f30f668695c6964f9bf5b183f3f7882"} Mar 18 16:44:38.910094 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.910072 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-224.ec2.internal" event={"ID":"3fe312c662a48946cb6ecc07ea171293","Type":"ContainerStarted","Data":"b3066fec1e246079f327610b54bed5df30d073dd9b10d9025c2e55ab21d8a334"} Mar 18 16:44:38.957333 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.957310 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/96282abf-ce09-4b33-baaf-73f9c5329541-system-cni-dir\") pod \"multus-additional-cni-plugins-d7xhk\" (UID: \"96282abf-ce09-4b33-baaf-73f9c5329541\") " pod="openshift-multus/multus-additional-cni-plugins-d7xhk" Mar 18 16:44:38.957417 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.957354 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/96282abf-ce09-4b33-baaf-73f9c5329541-tuning-conf-dir\") pod \"multus-additional-cni-plugins-d7xhk\" (UID: 
\"96282abf-ce09-4b33-baaf-73f9c5329541\") " pod="openshift-multus/multus-additional-cni-plugins-d7xhk" Mar 18 16:44:38.957417 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.957396 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/96282abf-ce09-4b33-baaf-73f9c5329541-cni-binary-copy\") pod \"multus-additional-cni-plugins-d7xhk\" (UID: \"96282abf-ce09-4b33-baaf-73f9c5329541\") " pod="openshift-multus/multus-additional-cni-plugins-d7xhk" Mar 18 16:44:38.957511 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.957426 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/96282abf-ce09-4b33-baaf-73f9c5329541-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-d7xhk\" (UID: \"96282abf-ce09-4b33-baaf-73f9c5329541\") " pod="openshift-multus/multus-additional-cni-plugins-d7xhk" Mar 18 16:44:38.957511 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.957456 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/96282abf-ce09-4b33-baaf-73f9c5329541-os-release\") pod \"multus-additional-cni-plugins-d7xhk\" (UID: \"96282abf-ce09-4b33-baaf-73f9c5329541\") " pod="openshift-multus/multus-additional-cni-plugins-d7xhk" Mar 18 16:44:38.957605 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.957524 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/96282abf-ce09-4b33-baaf-73f9c5329541-os-release\") pod \"multus-additional-cni-plugins-d7xhk\" (UID: \"96282abf-ce09-4b33-baaf-73f9c5329541\") " pod="openshift-multus/multus-additional-cni-plugins-d7xhk" Mar 18 16:44:38.957605 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.957550 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/96282abf-ce09-4b33-baaf-73f9c5329541-tuning-conf-dir\") pod \"multus-additional-cni-plugins-d7xhk\" (UID: \"96282abf-ce09-4b33-baaf-73f9c5329541\") " pod="openshift-multus/multus-additional-cni-plugins-d7xhk" Mar 18 16:44:38.957605 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.957601 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/96282abf-ce09-4b33-baaf-73f9c5329541-system-cni-dir\") pod \"multus-additional-cni-plugins-d7xhk\" (UID: \"96282abf-ce09-4b33-baaf-73f9c5329541\") " pod="openshift-multus/multus-additional-cni-plugins-d7xhk" Mar 18 16:44:38.957740 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.957632 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5gfw8\" (UniqueName: \"kubernetes.io/projected/96282abf-ce09-4b33-baaf-73f9c5329541-kube-api-access-5gfw8\") pod \"multus-additional-cni-plugins-d7xhk\" (UID: \"96282abf-ce09-4b33-baaf-73f9c5329541\") " pod="openshift-multus/multus-additional-cni-plugins-d7xhk" Mar 18 16:44:38.957786 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.957742 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fldmr\" (UniqueName: \"kubernetes.io/projected/440da786-0ff1-4727-bd36-e32a3acc5a3c-kube-api-access-fldmr\") pod \"network-check-target-cr96r\" (UID: \"440da786-0ff1-4727-bd36-e32a3acc5a3c\") " pod="openshift-network-diagnostics/network-check-target-cr96r" Mar 18 16:44:38.957834 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.957791 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/96282abf-ce09-4b33-baaf-73f9c5329541-cnibin\") pod \"multus-additional-cni-plugins-d7xhk\" (UID: \"96282abf-ce09-4b33-baaf-73f9c5329541\") " pod="openshift-multus/multus-additional-cni-plugins-d7xhk" Mar 18 
16:44:38.957834 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.957811 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/96282abf-ce09-4b33-baaf-73f9c5329541-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-d7xhk\" (UID: \"96282abf-ce09-4b33-baaf-73f9c5329541\") " pod="openshift-multus/multus-additional-cni-plugins-d7xhk" Mar 18 16:44:38.957929 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.957879 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/96282abf-ce09-4b33-baaf-73f9c5329541-cnibin\") pod \"multus-additional-cni-plugins-d7xhk\" (UID: \"96282abf-ce09-4b33-baaf-73f9c5329541\") " pod="openshift-multus/multus-additional-cni-plugins-d7xhk" Mar 18 16:44:38.958022 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.958002 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/96282abf-ce09-4b33-baaf-73f9c5329541-cni-binary-copy\") pod \"multus-additional-cni-plugins-d7xhk\" (UID: \"96282abf-ce09-4b33-baaf-73f9c5329541\") " pod="openshift-multus/multus-additional-cni-plugins-d7xhk" Mar 18 16:44:38.958069 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.958040 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/96282abf-ce09-4b33-baaf-73f9c5329541-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-d7xhk\" (UID: \"96282abf-ce09-4b33-baaf-73f9c5329541\") " pod="openshift-multus/multus-additional-cni-plugins-d7xhk" Mar 18 16:44:38.958189 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.958170 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/96282abf-ce09-4b33-baaf-73f9c5329541-cni-sysctl-allowlist\") pod 
\"multus-additional-cni-plugins-d7xhk\" (UID: \"96282abf-ce09-4b33-baaf-73f9c5329541\") " pod="openshift-multus/multus-additional-cni-plugins-d7xhk" Mar 18 16:44:38.964205 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:38.964190 2536 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 16:44:38.964246 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:38.964208 2536 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:44:38.964246 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:38.964217 2536 projected.go:194] Error preparing data for projected volume kube-api-access-fldmr for pod openshift-network-diagnostics/network-check-target-cr96r: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:38.964357 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:38.964285 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/440da786-0ff1-4727-bd36-e32a3acc5a3c-kube-api-access-fldmr podName:440da786-0ff1-4727-bd36-e32a3acc5a3c nodeName:}" failed. No retries permitted until 2026-03-18 16:44:39.464258526 +0000 UTC m=+2.237833090 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-fldmr" (UniqueName: "kubernetes.io/projected/440da786-0ff1-4727-bd36-e32a3acc5a3c-kube-api-access-fldmr") pod "network-check-target-cr96r" (UID: "440da786-0ff1-4727-bd36-e32a3acc5a3c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:38.966129 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:38.966113 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gfw8\" (UniqueName: \"kubernetes.io/projected/96282abf-ce09-4b33-baaf-73f9c5329541-kube-api-access-5gfw8\") pod \"multus-additional-cni-plugins-d7xhk\" (UID: \"96282abf-ce09-4b33-baaf-73f9c5329541\") " pod="openshift-multus/multus-additional-cni-plugins-d7xhk" Mar 18 16:44:39.065634 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:39.065606 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8pqnx" Mar 18 16:44:39.072572 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:39.072546 2536 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc13c7653_cb9a_48c0_a38e_8c050869e5a0.slice/crio-f984b9ffa6a7d94abacba26d4ad0995b61f73091869916df0216eca7d31f67b3 WatchSource:0}: Error finding container f984b9ffa6a7d94abacba26d4ad0995b61f73091869916df0216eca7d31f67b3: Status 404 returned error can't find the container with id f984b9ffa6a7d94abacba26d4ad0995b61f73091869916df0216eca7d31f67b3 Mar 18 16:44:39.083151 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:39.083133 2536 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-pj5bn" Mar 18 16:44:39.089456 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:39.089434 2536 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc87e1483_58f2_438b_af86_c607ffcbf01c.slice/crio-b0dd1c06e4e20fc9505a070fb5409c9815aea305590852c2faf3f1258d4707eb WatchSource:0}: Error finding container b0dd1c06e4e20fc9505a070fb5409c9815aea305590852c2faf3f1258d4707eb: Status 404 returned error can't find the container with id b0dd1c06e4e20fc9505a070fb5409c9815aea305590852c2faf3f1258d4707eb Mar 18 16:44:39.096282 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:39.096253 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-85jj7" Mar 18 16:44:39.100746 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:39.100720 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jnvc6" Mar 18 16:44:39.102667 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:39.102645 2536 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0763006_cf97_41dc_a668_a271612d9705.slice/crio-af45ec8bdd246bc369b859a2244ae03a89bc460848f9cee301c71412ac62f93b WatchSource:0}: Error finding container af45ec8bdd246bc369b859a2244ae03a89bc460848f9cee301c71412ac62f93b: Status 404 returned error can't find the container with id af45ec8bdd246bc369b859a2244ae03a89bc460848f9cee301c71412ac62f93b Mar 18 16:44:39.106912 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:39.106893 2536 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda421b67e_f253_48e4_b5a3_4a895c3bf6d2.slice/crio-7575b6235feebd57399111bb5425dd4fe5c0f6e245cd26e09794b6e4bf363da8 WatchSource:0}: Error finding container 
7575b6235feebd57399111bb5425dd4fe5c0f6e245cd26e09794b6e4bf363da8: Status 404 returned error can't find the container with id 7575b6235feebd57399111bb5425dd4fe5c0f6e245cd26e09794b6e4bf363da8 Mar 18 16:44:39.116315 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:39.116298 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-5ms7r" Mar 18 16:44:39.121111 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:39.121094 2536 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9464f832_e8a5_4469_9d2d_db0f0f547d86.slice/crio-21b56758d69aeba97caf151e3bd83b1464f1b74a81522b7b7a6d66a0f950defe WatchSource:0}: Error finding container 21b56758d69aeba97caf151e3bd83b1464f1b74a81522b7b7a6d66a0f950defe: Status 404 returned error can't find the container with id 21b56758d69aeba97caf151e3bd83b1464f1b74a81522b7b7a6d66a0f950defe Mar 18 16:44:39.131872 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:39.131851 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw" Mar 18 16:44:39.136980 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:39.136961 2536 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e6fe5a1_3b13_4d51_a141_1280e1b25b3a.slice/crio-64af77cdb983940d0bf085013622f22a9faa4d61839ff5c8a4c33925c3612023 WatchSource:0}: Error finding container 64af77cdb983940d0bf085013622f22a9faa4d61839ff5c8a4c33925c3612023: Status 404 returned error can't find the container with id 64af77cdb983940d0bf085013622f22a9faa4d61839ff5c8a4c33925c3612023 Mar 18 16:44:39.153816 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:39.153798 2536 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-74479" Mar 18 16:44:39.159022 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:39.159003 2536 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc68d2b91_1efd_47b8_93dc_98606a96920b.slice/crio-1957ad817ffde8347004e2600860082614f4ceae5b0e0dce94d5cb38136def08 WatchSource:0}: Error finding container 1957ad817ffde8347004e2600860082614f4ceae5b0e0dce94d5cb38136def08: Status 404 returned error can't find the container with id 1957ad817ffde8347004e2600860082614f4ceae5b0e0dce94d5cb38136def08 Mar 18 16:44:39.183300 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:39.183284 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-d7xhk" Mar 18 16:44:39.190445 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:44:39.190290 2536 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96282abf_ce09_4b33_baaf_73f9c5329541.slice/crio-1bc90252ceaafcae531ac3beb5f5b174c07f5a5cbf71256384213b839a5fbab8 WatchSource:0}: Error finding container 1bc90252ceaafcae531ac3beb5f5b174c07f5a5cbf71256384213b839a5fbab8: Status 404 returned error can't find the container with id 1bc90252ceaafcae531ac3beb5f5b174c07f5a5cbf71256384213b839a5fbab8 Mar 18 16:44:39.361168 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:39.361085 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/275a2fa6-277f-40dc-a2bc-749a97550e2e-metrics-certs\") pod \"network-metrics-daemon-rjx6m\" (UID: \"275a2fa6-277f-40dc-a2bc-749a97550e2e\") " pod="openshift-multus/network-metrics-daemon-rjx6m" Mar 18 16:44:39.361168 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:39.361162 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" 
(UniqueName: \"kubernetes.io/secret/b998ab02-161a-40e9-9e53-7183a98152de-original-pull-secret\") pod \"global-pull-secret-syncer-f52ql\" (UID: \"b998ab02-161a-40e9-9e53-7183a98152de\") " pod="kube-system/global-pull-secret-syncer-f52ql" Mar 18 16:44:39.361402 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:39.361304 2536 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Mar 18 16:44:39.361402 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:39.361364 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b998ab02-161a-40e9-9e53-7183a98152de-original-pull-secret podName:b998ab02-161a-40e9-9e53-7183a98152de nodeName:}" failed. No retries permitted until 2026-03-18 16:44:40.361345774 +0000 UTC m=+3.134920342 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b998ab02-161a-40e9-9e53-7183a98152de-original-pull-secret") pod "global-pull-secret-syncer-f52ql" (UID: "b998ab02-161a-40e9-9e53-7183a98152de") : object "kube-system"/"original-pull-secret" not registered Mar 18 16:44:39.361642 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:39.361617 2536 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:39.361707 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:39.361670 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/275a2fa6-277f-40dc-a2bc-749a97550e2e-metrics-certs podName:275a2fa6-277f-40dc-a2bc-749a97550e2e nodeName:}" failed. No retries permitted until 2026-03-18 16:44:40.361655448 +0000 UTC m=+3.135230019 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/275a2fa6-277f-40dc-a2bc-749a97550e2e-metrics-certs") pod "network-metrics-daemon-rjx6m" (UID: "275a2fa6-277f-40dc-a2bc-749a97550e2e") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:39.563087 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:39.562996 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fldmr\" (UniqueName: \"kubernetes.io/projected/440da786-0ff1-4727-bd36-e32a3acc5a3c-kube-api-access-fldmr\") pod \"network-check-target-cr96r\" (UID: \"440da786-0ff1-4727-bd36-e32a3acc5a3c\") " pod="openshift-network-diagnostics/network-check-target-cr96r" Mar 18 16:44:39.563244 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:39.563152 2536 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 16:44:39.563244 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:39.563174 2536 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:44:39.563244 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:39.563236 2536 projected.go:194] Error preparing data for projected volume kube-api-access-fldmr for pod openshift-network-diagnostics/network-check-target-cr96r: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:39.563433 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:39.563319 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/440da786-0ff1-4727-bd36-e32a3acc5a3c-kube-api-access-fldmr podName:440da786-0ff1-4727-bd36-e32a3acc5a3c nodeName:}" failed. 
No retries permitted until 2026-03-18 16:44:40.563300134 +0000 UTC m=+3.336874715 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-fldmr" (UniqueName: "kubernetes.io/projected/440da786-0ff1-4727-bd36-e32a3acc5a3c-kube-api-access-fldmr") pod "network-check-target-cr96r" (UID: "440da786-0ff1-4727-bd36-e32a3acc5a3c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:39.647117 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:39.647034 2536 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Mar 18 16:44:39.787660 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:39.787582 2536 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-03-17 16:39:38 +0000 UTC" deadline="2027-10-16 13:56:49.552970326 +0000 UTC" Mar 18 16:44:39.787660 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:39.787613 2536 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13845h12m9.765360256s" Mar 18 16:44:39.908282 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:39.907697 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cr96r" Mar 18 16:44:39.908282 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:39.907811 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cr96r" podUID="440da786-0ff1-4727-bd36-e32a3acc5a3c" Mar 18 16:44:39.908282 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:39.907827 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-f52ql" Mar 18 16:44:39.908282 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:39.907895 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-f52ql" podUID="b998ab02-161a-40e9-9e53-7183a98152de" Mar 18 16:44:39.908282 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:39.907976 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rjx6m" Mar 18 16:44:39.908282 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:39.908056 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rjx6m" podUID="275a2fa6-277f-40dc-a2bc-749a97550e2e"
Mar 18 16:44:39.950945 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:39.950900 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d7xhk" event={"ID":"96282abf-ce09-4b33-baaf-73f9c5329541","Type":"ContainerStarted","Data":"1bc90252ceaafcae531ac3beb5f5b174c07f5a5cbf71256384213b839a5fbab8"}
Mar 18 16:44:39.967811 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:39.967743 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-74479" event={"ID":"c68d2b91-1efd-47b8-93dc-98606a96920b","Type":"ContainerStarted","Data":"1957ad817ffde8347004e2600860082614f4ceae5b0e0dce94d5cb38136def08"}
Mar 18 16:44:39.979201 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:39.979131 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-pj5bn" event={"ID":"c87e1483-58f2-438b-af86-c607ffcbf01c","Type":"ContainerStarted","Data":"b0dd1c06e4e20fc9505a070fb5409c9815aea305590852c2faf3f1258d4707eb"}
Mar 18 16:44:40.011305 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:40.006913 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw" event={"ID":"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a","Type":"ContainerStarted","Data":"64af77cdb983940d0bf085013622f22a9faa4d61839ff5c8a4c33925c3612023"}
Mar 18 16:44:40.019628 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:40.019577 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5ms7r" event={"ID":"9464f832-e8a5-4469-9d2d-db0f0f547d86","Type":"ContainerStarted","Data":"21b56758d69aeba97caf151e3bd83b1464f1b74a81522b7b7a6d66a0f950defe"}
Mar 18 16:44:40.022822 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:40.022769 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jnvc6" event={"ID":"a421b67e-f253-48e4-b5a3-4a895c3bf6d2","Type":"ContainerStarted","Data":"7575b6235feebd57399111bb5425dd4fe5c0f6e245cd26e09794b6e4bf363da8"}
Mar 18 16:44:40.034732 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:40.034706 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-85jj7" event={"ID":"b0763006-cf97-41dc-a668-a271612d9705","Type":"ContainerStarted","Data":"af45ec8bdd246bc369b859a2244ae03a89bc460848f9cee301c71412ac62f93b"}
Mar 18 16:44:40.047147 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:40.047095 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8pqnx" event={"ID":"c13c7653-cb9a-48c0-a38e-8c050869e5a0","Type":"ContainerStarted","Data":"f984b9ffa6a7d94abacba26d4ad0995b61f73091869916df0216eca7d31f67b3"}
Mar 18 16:44:40.071026 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:40.071001 2536 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Mar 18 16:44:40.146507 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:40.144512 2536 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Mar 18 16:44:40.369888 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:40.369806 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/275a2fa6-277f-40dc-a2bc-749a97550e2e-metrics-certs\") pod \"network-metrics-daemon-rjx6m\" (UID: \"275a2fa6-277f-40dc-a2bc-749a97550e2e\") " pod="openshift-multus/network-metrics-daemon-rjx6m"
Mar 18 16:44:40.370092 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:40.369890 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b998ab02-161a-40e9-9e53-7183a98152de-original-pull-secret\") pod \"global-pull-secret-syncer-f52ql\" (UID: \"b998ab02-161a-40e9-9e53-7183a98152de\") " pod="kube-system/global-pull-secret-syncer-f52ql"
Mar 18 16:44:40.370092 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:40.370001 2536 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Mar 18 16:44:40.370092 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:40.370078 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b998ab02-161a-40e9-9e53-7183a98152de-original-pull-secret podName:b998ab02-161a-40e9-9e53-7183a98152de nodeName:}" failed. No retries permitted until 2026-03-18 16:44:42.370060127 +0000 UTC m=+5.143634696 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b998ab02-161a-40e9-9e53-7183a98152de-original-pull-secret") pod "global-pull-secret-syncer-f52ql" (UID: "b998ab02-161a-40e9-9e53-7183a98152de") : object "kube-system"/"original-pull-secret" not registered
Mar 18 16:44:40.370522 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:40.370502 2536 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:44:40.370642 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:40.370555 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/275a2fa6-277f-40dc-a2bc-749a97550e2e-metrics-certs podName:275a2fa6-277f-40dc-a2bc-749a97550e2e nodeName:}" failed. No retries permitted until 2026-03-18 16:44:42.370539903 +0000 UTC m=+5.144114477 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/275a2fa6-277f-40dc-a2bc-749a97550e2e-metrics-certs") pod "network-metrics-daemon-rjx6m" (UID: "275a2fa6-277f-40dc-a2bc-749a97550e2e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:44:40.571974 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:40.571938 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fldmr\" (UniqueName: \"kubernetes.io/projected/440da786-0ff1-4727-bd36-e32a3acc5a3c-kube-api-access-fldmr\") pod \"network-check-target-cr96r\" (UID: \"440da786-0ff1-4727-bd36-e32a3acc5a3c\") " pod="openshift-network-diagnostics/network-check-target-cr96r"
Mar 18 16:44:40.572139 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:40.572112 2536 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 16:44:40.572139 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:40.572131 2536 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 16:44:40.572261 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:40.572143 2536 projected.go:194] Error preparing data for projected volume kube-api-access-fldmr for pod openshift-network-diagnostics/network-check-target-cr96r: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:44:40.572261 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:40.572198 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/440da786-0ff1-4727-bd36-e32a3acc5a3c-kube-api-access-fldmr podName:440da786-0ff1-4727-bd36-e32a3acc5a3c nodeName:}" failed. No retries permitted until 2026-03-18 16:44:42.572178792 +0000 UTC m=+5.345753356 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-fldmr" (UniqueName: "kubernetes.io/projected/440da786-0ff1-4727-bd36-e32a3acc5a3c-kube-api-access-fldmr") pod "network-check-target-cr96r" (UID: "440da786-0ff1-4727-bd36-e32a3acc5a3c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:44:40.788397 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:40.788262 2536 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-03-17 16:39:38 +0000 UTC" deadline="2027-10-14 20:37:34.610133451 +0000 UTC"
Mar 18 16:44:40.788397 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:40.788319 2536 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13803h52m53.821818506s"
Mar 18 16:44:41.905758 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:41.905705 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rjx6m"
Mar 18 16:44:41.906198 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:41.905839 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rjx6m" podUID="275a2fa6-277f-40dc-a2bc-749a97550e2e"
Mar 18 16:44:41.906198 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:41.905872 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-f52ql"
Mar 18 16:44:41.906198 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:41.905892 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cr96r"
Mar 18 16:44:41.906198 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:41.905990 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-f52ql" podUID="b998ab02-161a-40e9-9e53-7183a98152de"
Mar 18 16:44:41.906198 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:41.906076 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cr96r" podUID="440da786-0ff1-4727-bd36-e32a3acc5a3c"
Mar 18 16:44:42.388644 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:42.388532 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b998ab02-161a-40e9-9e53-7183a98152de-original-pull-secret\") pod \"global-pull-secret-syncer-f52ql\" (UID: \"b998ab02-161a-40e9-9e53-7183a98152de\") " pod="kube-system/global-pull-secret-syncer-f52ql"
Mar 18 16:44:42.388644 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:42.388600 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/275a2fa6-277f-40dc-a2bc-749a97550e2e-metrics-certs\") pod \"network-metrics-daemon-rjx6m\" (UID: \"275a2fa6-277f-40dc-a2bc-749a97550e2e\") " pod="openshift-multus/network-metrics-daemon-rjx6m"
Mar 18 16:44:42.388858 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:42.388728 2536 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:44:42.388858 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:42.388786 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/275a2fa6-277f-40dc-a2bc-749a97550e2e-metrics-certs podName:275a2fa6-277f-40dc-a2bc-749a97550e2e nodeName:}" failed. No retries permitted until 2026-03-18 16:44:46.388767991 +0000 UTC m=+9.162342562 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/275a2fa6-277f-40dc-a2bc-749a97550e2e-metrics-certs") pod "network-metrics-daemon-rjx6m" (UID: "275a2fa6-277f-40dc-a2bc-749a97550e2e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:44:42.389196 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:42.389153 2536 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Mar 18 16:44:42.389307 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:42.389200 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b998ab02-161a-40e9-9e53-7183a98152de-original-pull-secret podName:b998ab02-161a-40e9-9e53-7183a98152de nodeName:}" failed. No retries permitted until 2026-03-18 16:44:46.389186328 +0000 UTC m=+9.162760896 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b998ab02-161a-40e9-9e53-7183a98152de-original-pull-secret") pod "global-pull-secret-syncer-f52ql" (UID: "b998ab02-161a-40e9-9e53-7183a98152de") : object "kube-system"/"original-pull-secret" not registered
Mar 18 16:44:42.589714 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:42.589679 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fldmr\" (UniqueName: \"kubernetes.io/projected/440da786-0ff1-4727-bd36-e32a3acc5a3c-kube-api-access-fldmr\") pod \"network-check-target-cr96r\" (UID: \"440da786-0ff1-4727-bd36-e32a3acc5a3c\") " pod="openshift-network-diagnostics/network-check-target-cr96r"
Mar 18 16:44:42.589901 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:42.589851 2536 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 16:44:42.589901 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:42.589874 2536 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 16:44:42.589901 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:42.589888 2536 projected.go:194] Error preparing data for projected volume kube-api-access-fldmr for pod openshift-network-diagnostics/network-check-target-cr96r: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:44:42.590109 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:42.589949 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/440da786-0ff1-4727-bd36-e32a3acc5a3c-kube-api-access-fldmr podName:440da786-0ff1-4727-bd36-e32a3acc5a3c nodeName:}" failed. No retries permitted until 2026-03-18 16:44:46.589929524 +0000 UTC m=+9.363504091 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-fldmr" (UniqueName: "kubernetes.io/projected/440da786-0ff1-4727-bd36-e32a3acc5a3c-kube-api-access-fldmr") pod "network-check-target-cr96r" (UID: "440da786-0ff1-4727-bd36-e32a3acc5a3c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:44:43.905430 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:43.905396 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cr96r"
Mar 18 16:44:43.905846 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:43.905489 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cr96r" podUID="440da786-0ff1-4727-bd36-e32a3acc5a3c"
Mar 18 16:44:43.905846 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:43.905805 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-f52ql"
Mar 18 16:44:43.905942 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:43.905900 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-f52ql" podUID="b998ab02-161a-40e9-9e53-7183a98152de"
Mar 18 16:44:43.906065 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:43.905964 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rjx6m"
Mar 18 16:44:43.906065 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:43.906050 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rjx6m" podUID="275a2fa6-277f-40dc-a2bc-749a97550e2e"
Mar 18 16:44:45.906490 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:45.906452 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rjx6m"
Mar 18 16:44:45.906938 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:45.906499 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cr96r"
Mar 18 16:44:45.906938 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:45.906615 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rjx6m" podUID="275a2fa6-277f-40dc-a2bc-749a97550e2e"
Mar 18 16:44:45.906938 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:45.906746 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-f52ql"
Mar 18 16:44:45.906938 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:45.906849 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-f52ql" podUID="b998ab02-161a-40e9-9e53-7183a98152de"
Mar 18 16:44:45.906938 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:45.906923 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cr96r" podUID="440da786-0ff1-4727-bd36-e32a3acc5a3c"
Mar 18 16:44:46.420965 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:46.420202 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b998ab02-161a-40e9-9e53-7183a98152de-original-pull-secret\") pod \"global-pull-secret-syncer-f52ql\" (UID: \"b998ab02-161a-40e9-9e53-7183a98152de\") " pod="kube-system/global-pull-secret-syncer-f52ql"
Mar 18 16:44:46.420965 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:46.420289 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/275a2fa6-277f-40dc-a2bc-749a97550e2e-metrics-certs\") pod \"network-metrics-daemon-rjx6m\" (UID: \"275a2fa6-277f-40dc-a2bc-749a97550e2e\") " pod="openshift-multus/network-metrics-daemon-rjx6m"
Mar 18 16:44:46.420965 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:46.420424 2536 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:44:46.420965 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:46.420488 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/275a2fa6-277f-40dc-a2bc-749a97550e2e-metrics-certs podName:275a2fa6-277f-40dc-a2bc-749a97550e2e nodeName:}" failed. No retries permitted until 2026-03-18 16:44:54.420467809 +0000 UTC m=+17.194042378 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/275a2fa6-277f-40dc-a2bc-749a97550e2e-metrics-certs") pod "network-metrics-daemon-rjx6m" (UID: "275a2fa6-277f-40dc-a2bc-749a97550e2e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:44:46.420965 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:46.420882 2536 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Mar 18 16:44:46.420965 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:46.420930 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b998ab02-161a-40e9-9e53-7183a98152de-original-pull-secret podName:b998ab02-161a-40e9-9e53-7183a98152de nodeName:}" failed. No retries permitted until 2026-03-18 16:44:54.42091586 +0000 UTC m=+17.194490431 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b998ab02-161a-40e9-9e53-7183a98152de-original-pull-secret") pod "global-pull-secret-syncer-f52ql" (UID: "b998ab02-161a-40e9-9e53-7183a98152de") : object "kube-system"/"original-pull-secret" not registered
Mar 18 16:44:46.622161 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:46.621791 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fldmr\" (UniqueName: \"kubernetes.io/projected/440da786-0ff1-4727-bd36-e32a3acc5a3c-kube-api-access-fldmr\") pod \"network-check-target-cr96r\" (UID: \"440da786-0ff1-4727-bd36-e32a3acc5a3c\") " pod="openshift-network-diagnostics/network-check-target-cr96r"
Mar 18 16:44:46.622161 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:46.622010 2536 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 16:44:46.622161 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:46.622032 2536 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 16:44:46.622161 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:46.622044 2536 projected.go:194] Error preparing data for projected volume kube-api-access-fldmr for pod openshift-network-diagnostics/network-check-target-cr96r: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:44:46.622161 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:46.622102 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/440da786-0ff1-4727-bd36-e32a3acc5a3c-kube-api-access-fldmr podName:440da786-0ff1-4727-bd36-e32a3acc5a3c nodeName:}" failed. No retries permitted until 2026-03-18 16:44:54.62208389 +0000 UTC m=+17.395658459 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-fldmr" (UniqueName: "kubernetes.io/projected/440da786-0ff1-4727-bd36-e32a3acc5a3c-kube-api-access-fldmr") pod "network-check-target-cr96r" (UID: "440da786-0ff1-4727-bd36-e32a3acc5a3c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:44:47.906751 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:47.906485 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cr96r"
Mar 18 16:44:47.906751 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:47.906596 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cr96r" podUID="440da786-0ff1-4727-bd36-e32a3acc5a3c"
Mar 18 16:44:47.906751 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:47.906695 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-f52ql"
Mar 18 16:44:47.907332 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:47.906801 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-f52ql" podUID="b998ab02-161a-40e9-9e53-7183a98152de"
Mar 18 16:44:47.907332 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:47.906845 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rjx6m"
Mar 18 16:44:47.907332 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:47.906956 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rjx6m" podUID="275a2fa6-277f-40dc-a2bc-749a97550e2e"
Mar 18 16:44:49.908493 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:49.908460 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cr96r"
Mar 18 16:44:49.908493 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:49.908489 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-f52ql"
Mar 18 16:44:49.908953 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:49.908562 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cr96r" podUID="440da786-0ff1-4727-bd36-e32a3acc5a3c"
Mar 18 16:44:49.908953 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:49.908620 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rjx6m"
Mar 18 16:44:49.908953 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:49.908703 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rjx6m" podUID="275a2fa6-277f-40dc-a2bc-749a97550e2e"
Mar 18 16:44:49.908953 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:49.908790 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-f52ql" podUID="b998ab02-161a-40e9-9e53-7183a98152de"
Mar 18 16:44:51.905727 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:51.905676 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cr96r"
Mar 18 16:44:51.906199 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:51.905777 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rjx6m"
Mar 18 16:44:51.906199 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:51.905802 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cr96r" podUID="440da786-0ff1-4727-bd36-e32a3acc5a3c"
Mar 18 16:44:51.906199 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:51.905891 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rjx6m" podUID="275a2fa6-277f-40dc-a2bc-749a97550e2e"
Mar 18 16:44:51.906199 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:51.905940 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-f52ql"
Mar 18 16:44:51.906199 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:51.906038 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-f52ql" podUID="b998ab02-161a-40e9-9e53-7183a98152de"
Mar 18 16:44:53.905920 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:53.905882 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rjx6m"
Mar 18 16:44:53.906445 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:53.905888 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-f52ql"
Mar 18 16:44:53.906445 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:53.905882 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cr96r"
Mar 18 16:44:53.906445 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:53.906078 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rjx6m" podUID="275a2fa6-277f-40dc-a2bc-749a97550e2e"
Mar 18 16:44:53.906445 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:53.906178 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-f52ql" podUID="b998ab02-161a-40e9-9e53-7183a98152de"
Mar 18 16:44:53.906445 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:53.906310 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cr96r" podUID="440da786-0ff1-4727-bd36-e32a3acc5a3c"
Mar 18 16:44:54.482408 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:54.482373 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b998ab02-161a-40e9-9e53-7183a98152de-original-pull-secret\") pod \"global-pull-secret-syncer-f52ql\" (UID: \"b998ab02-161a-40e9-9e53-7183a98152de\") " pod="kube-system/global-pull-secret-syncer-f52ql"
Mar 18 16:44:54.482600 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:54.482423 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/275a2fa6-277f-40dc-a2bc-749a97550e2e-metrics-certs\") pod \"network-metrics-daemon-rjx6m\" (UID: \"275a2fa6-277f-40dc-a2bc-749a97550e2e\") " pod="openshift-multus/network-metrics-daemon-rjx6m"
Mar 18 16:44:54.482600 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:54.482534 2536 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:44:54.482600 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:54.482535 2536 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Mar 18 16:44:54.482600 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:54.482596 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/275a2fa6-277f-40dc-a2bc-749a97550e2e-metrics-certs podName:275a2fa6-277f-40dc-a2bc-749a97550e2e nodeName:}" failed. No retries permitted until 2026-03-18 16:45:10.48258044 +0000 UTC m=+33.256155003 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/275a2fa6-277f-40dc-a2bc-749a97550e2e-metrics-certs") pod "network-metrics-daemon-rjx6m" (UID: "275a2fa6-277f-40dc-a2bc-749a97550e2e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:44:54.482765 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:54.482610 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b998ab02-161a-40e9-9e53-7183a98152de-original-pull-secret podName:b998ab02-161a-40e9-9e53-7183a98152de nodeName:}" failed. No retries permitted until 2026-03-18 16:45:10.482604198 +0000 UTC m=+33.256178763 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b998ab02-161a-40e9-9e53-7183a98152de-original-pull-secret") pod "global-pull-secret-syncer-f52ql" (UID: "b998ab02-161a-40e9-9e53-7183a98152de") : object "kube-system"/"original-pull-secret" not registered
Mar 18 16:44:54.684849 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:54.684817 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fldmr\" (UniqueName: \"kubernetes.io/projected/440da786-0ff1-4727-bd36-e32a3acc5a3c-kube-api-access-fldmr\") pod \"network-check-target-cr96r\" (UID: \"440da786-0ff1-4727-bd36-e32a3acc5a3c\") " pod="openshift-network-diagnostics/network-check-target-cr96r"
Mar 18 16:44:54.685154 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:54.684952 2536 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 16:44:54.685154 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:54.684970 2536 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 16:44:54.685154 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:54.684979 2536 projected.go:194] Error preparing data for projected volume kube-api-access-fldmr for pod openshift-network-diagnostics/network-check-target-cr96r: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:44:54.685154 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:54.685025 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/440da786-0ff1-4727-bd36-e32a3acc5a3c-kube-api-access-fldmr podName:440da786-0ff1-4727-bd36-e32a3acc5a3c nodeName:}" failed. No retries permitted until 2026-03-18 16:45:10.685012879 +0000 UTC m=+33.458587444 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-fldmr" (UniqueName: "kubernetes.io/projected/440da786-0ff1-4727-bd36-e32a3acc5a3c-kube-api-access-fldmr") pod "network-check-target-cr96r" (UID: "440da786-0ff1-4727-bd36-e32a3acc5a3c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:44:55.905411 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:55.905346 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cr96r"
Mar 18 16:44:55.905856 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:55.905466 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cr96r" podUID="440da786-0ff1-4727-bd36-e32a3acc5a3c"
Mar 18 16:44:55.905856 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:55.905346 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rjx6m"
Mar 18 16:44:55.905856 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:55.905565 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rjx6m" podUID="275a2fa6-277f-40dc-a2bc-749a97550e2e"
Mar 18 16:44:55.905856 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:55.905346 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-f52ql"
Mar 18 16:44:55.905856 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:55.905640 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-f52ql" podUID="b998ab02-161a-40e9-9e53-7183a98152de"
Mar 18 16:44:57.906567 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:57.906363 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cr96r"
Mar 18 16:44:57.907057 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:57.906426 2536 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rjx6m" Mar 18 16:44:57.907057 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:57.906657 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cr96r" podUID="440da786-0ff1-4727-bd36-e32a3acc5a3c" Mar 18 16:44:57.907057 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:57.906447 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-f52ql" Mar 18 16:44:57.907057 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:57.906746 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rjx6m" podUID="275a2fa6-277f-40dc-a2bc-749a97550e2e" Mar 18 16:44:57.907057 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:57.906791 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-f52ql" podUID="b998ab02-161a-40e9-9e53-7183a98152de" Mar 18 16:44:58.083004 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:58.082966 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-pj5bn" event={"ID":"c87e1483-58f2-438b-af86-c607ffcbf01c","Type":"ContainerStarted","Data":"460e86afccf45febd0aa82fe81a312a19f4f80826c67d1b16c57a2e130e9d4c1"} Mar 18 16:44:58.084308 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:58.084258 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9fw2f" event={"ID":"37693aba-cef0-4f5a-a523-bd82dbff0143","Type":"ContainerStarted","Data":"3d87ead8c3a6a9fb1c7b37ece387bdebba1516543ca7552af622d186ec469401"} Mar 18 16:44:58.086568 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:58.086548 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ts4mw_3e6fe5a1-3b13-4d51-a141-1280e1b25b3a/ovn-acl-logging/0.log" Mar 18 16:44:58.086833 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:58.086815 2536 generic.go:358] "Generic (PLEG): container finished" podID="3e6fe5a1-3b13-4d51-a141-1280e1b25b3a" containerID="4ef9ff2885e0555dc0a22b1512f2b38174145e3553f1375a9d17c40a9cb4e479" exitCode=1 Mar 18 16:44:58.086896 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:58.086875 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw" event={"ID":"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a","Type":"ContainerStarted","Data":"4528cf13d8879d2e6d8f99c9fa48b0622bb3ff98102bc7646b6228588b915bfb"} Mar 18 16:44:58.086951 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:58.086904 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw" event={"ID":"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a","Type":"ContainerStarted","Data":"ad25bed540e745dcf76eb97b7f5291055d55cabe6ccb380b180ab85b9043a577"} Mar 18 16:44:58.086951 ip-10-0-132-224 
kubenswrapper[2536]: I0318 16:44:58.086914 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw" event={"ID":"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a","Type":"ContainerStarted","Data":"5df24474ad8274a7b204ee336856d2a507b7cdbe35d0e90db9f1d63048896480"} Mar 18 16:44:58.086951 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:58.086922 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw" event={"ID":"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a","Type":"ContainerStarted","Data":"f8e6d4501726010ce7ee24bcda70b4863b16b1ddbe4c8684becb0a2444457d4d"} Mar 18 16:44:58.086951 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:58.086930 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw" event={"ID":"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a","Type":"ContainerDied","Data":"4ef9ff2885e0555dc0a22b1512f2b38174145e3553f1375a9d17c40a9cb4e479"} Mar 18 16:44:58.086951 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:58.086940 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw" event={"ID":"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a","Type":"ContainerStarted","Data":"ee63164b2172effe92c6a320f515e23f71f01e7515076a83a33ee11e0ffbdb12"} Mar 18 16:44:58.087971 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:58.087948 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jnvc6" event={"ID":"a421b67e-f253-48e4-b5a3-4a895c3bf6d2","Type":"ContainerStarted","Data":"74a8ed5c696ce4b540acaa1a32b6957b7f649adb93080fed85a22d9fbbc389eb"} Mar 18 16:44:58.089153 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:58.089134 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-85jj7" event={"ID":"b0763006-cf97-41dc-a668-a271612d9705","Type":"ContainerStarted","Data":"0da0c9a2a0b8868ff0581d013f33afb46c3ab724aa49833fd14e857ad122701e"} Mar 
18 16:44:58.090385 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:58.090367 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8pqnx" event={"ID":"c13c7653-cb9a-48c0-a38e-8c050869e5a0","Type":"ContainerStarted","Data":"df2cfe1e6974ed2adae59eb81080ee0612ce7dc37a34969addd40ec9225f0da6"} Mar 18 16:44:58.091562 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:58.091540 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-224.ec2.internal" event={"ID":"7a6f0eddc0950f127c5c3c1035cc71a3","Type":"ContainerStarted","Data":"c7b08acbdba17b9034c1481732ff73426557457cc0b827a9bf704cda49a37785"} Mar 18 16:44:58.092718 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:58.092699 2536 generic.go:358] "Generic (PLEG): container finished" podID="3fe312c662a48946cb6ecc07ea171293" containerID="ea2f5fa4b5b1f917708c9c729339185308f25069827060641b9e245a0c4e12b5" exitCode=0 Mar 18 16:44:58.092818 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:58.092752 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-224.ec2.internal" event={"ID":"3fe312c662a48946cb6ecc07ea171293","Type":"ContainerDied","Data":"ea2f5fa4b5b1f917708c9c729339185308f25069827060641b9e245a0c4e12b5"} Mar 18 16:44:58.094006 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:58.093983 2536 generic.go:358] "Generic (PLEG): container finished" podID="96282abf-ce09-4b33-baaf-73f9c5329541" containerID="4f666fec157191e471fa88a771de29cb31268790e8d54cfca4b8983ec8756c5f" exitCode=0 Mar 18 16:44:58.094100 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:58.094049 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d7xhk" event={"ID":"96282abf-ce09-4b33-baaf-73f9c5329541","Type":"ContainerDied","Data":"4f666fec157191e471fa88a771de29cb31268790e8d54cfca4b8983ec8756c5f"} Mar 18 16:44:58.095207 
ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:58.095183 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-74479" event={"ID":"c68d2b91-1efd-47b8-93dc-98606a96920b","Type":"ContainerStarted","Data":"a8d29dba8655cf30e7b83b5291db1059d71304a9e038d4fab91c4e1c829b03d0"} Mar 18 16:44:58.098903 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:58.098862 2536 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-pj5bn" podStartSLOduration=3.099268743 podStartE2EDuration="21.098850583s" podCreationTimestamp="2026-03-18 16:44:37 +0000 UTC" firstStartedPulling="2026-03-18 16:44:39.09085357 +0000 UTC m=+1.864428134" lastFinishedPulling="2026-03-18 16:44:57.090435397 +0000 UTC m=+19.864009974" observedRunningTime="2026-03-18 16:44:58.098438995 +0000 UTC m=+20.872013580" watchObservedRunningTime="2026-03-18 16:44:58.098850583 +0000 UTC m=+20.872425169" Mar 18 16:44:58.114586 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:58.114215 2536 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-9fw2f" podStartSLOduration=1.914557264 podStartE2EDuration="20.114198543s" podCreationTimestamp="2026-03-18 16:44:38 +0000 UTC" firstStartedPulling="2026-03-18 16:44:38.890215238 +0000 UTC m=+1.663789806" lastFinishedPulling="2026-03-18 16:44:57.089856509 +0000 UTC m=+19.863431085" observedRunningTime="2026-03-18 16:44:58.113578481 +0000 UTC m=+20.887153066" watchObservedRunningTime="2026-03-18 16:44:58.114198543 +0000 UTC m=+20.887773127" Mar 18 16:44:58.126129 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:58.126084 2536 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-224.ec2.internal" podStartSLOduration=20.126070152 podStartE2EDuration="20.126070152s" podCreationTimestamp="2026-03-18 16:44:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:44:58.12570134 +0000 UTC m=+20.899275927" watchObservedRunningTime="2026-03-18 16:44:58.126070152 +0000 UTC m=+20.899644738" Mar 18 16:44:58.140892 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:58.140842 2536 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-85jj7" podStartSLOduration=3.15382855 podStartE2EDuration="21.140830463s" podCreationTimestamp="2026-03-18 16:44:37 +0000 UTC" firstStartedPulling="2026-03-18 16:44:39.104168794 +0000 UTC m=+1.877743357" lastFinishedPulling="2026-03-18 16:44:57.091170692 +0000 UTC m=+19.864745270" observedRunningTime="2026-03-18 16:44:58.14060368 +0000 UTC m=+20.914178271" watchObservedRunningTime="2026-03-18 16:44:58.140830463 +0000 UTC m=+20.914405050" Mar 18 16:44:58.191892 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:58.191852 2536 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-jnvc6" podStartSLOduration=3.209500932 podStartE2EDuration="21.19183907s" podCreationTimestamp="2026-03-18 16:44:37 +0000 UTC" firstStartedPulling="2026-03-18 16:44:39.108249422 +0000 UTC m=+1.881823989" lastFinishedPulling="2026-03-18 16:44:57.090587563 +0000 UTC m=+19.864162127" observedRunningTime="2026-03-18 16:44:58.191404969 +0000 UTC m=+20.964979580" watchObservedRunningTime="2026-03-18 16:44:58.19183907 +0000 UTC m=+20.965413700" Mar 18 16:44:58.214603 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:58.214568 2536 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-74479" podStartSLOduration=3.197263373 podStartE2EDuration="21.214555297s" podCreationTimestamp="2026-03-18 16:44:37 +0000 UTC" firstStartedPulling="2026-03-18 16:44:39.160445166 +0000 UTC m=+1.934019734" lastFinishedPulling="2026-03-18 16:44:57.177737083 +0000 UTC m=+19.951311658" observedRunningTime="2026-03-18 16:44:58.214471427 +0000 UTC 
m=+20.988046023" watchObservedRunningTime="2026-03-18 16:44:58.214555297 +0000 UTC m=+20.988129883" Mar 18 16:44:58.700711 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:58.700682 2536 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-pj5bn" Mar 18 16:44:58.737240 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:58.737205 2536 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Mar 18 16:44:58.819403 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:58.819219 2536 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-03-18T16:44:58.737222137Z","UUID":"038f050a-0ff1-4e00-8668-b5aeea1c9b74","Handler":null,"Name":"","Endpoint":""} Mar 18 16:44:58.821165 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:58.821144 2536 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Mar 18 16:44:58.821304 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:58.821172 2536 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Mar 18 16:44:58.902410 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:58.902375 2536 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-pj5bn" Mar 18 16:44:58.903134 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:58.903111 2536 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-pj5bn" Mar 18 16:44:59.098704 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:59.098616 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5ms7r" 
event={"ID":"9464f832-e8a5-4469-9d2d-db0f0f547d86","Type":"ContainerStarted","Data":"b16e81be66e16c40970c3bce779ae1dc2088699a225244346344a7184b30b32c"} Mar 18 16:44:59.100467 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:59.100432 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8pqnx" event={"ID":"c13c7653-cb9a-48c0-a38e-8c050869e5a0","Type":"ContainerStarted","Data":"b8390e5616cd38486e429037cd30fc5550698cbded2dc2f6136febc66535272a"} Mar 18 16:44:59.102408 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:59.102301 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-224.ec2.internal" event={"ID":"3fe312c662a48946cb6ecc07ea171293","Type":"ContainerStarted","Data":"0f1497024f200437288cc4f17b6cf15823309ae25cf41f7b757364636761d1e0"} Mar 18 16:44:59.103800 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:59.103778 2536 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-pj5bn" Mar 18 16:44:59.113140 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:59.113074 2536 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-5ms7r" podStartSLOduration=8.054569434 podStartE2EDuration="22.113062848s" podCreationTimestamp="2026-03-18 16:44:37 +0000 UTC" firstStartedPulling="2026-03-18 16:44:39.122502893 +0000 UTC m=+1.896077461" lastFinishedPulling="2026-03-18 16:44:53.180996296 +0000 UTC m=+15.954570875" observedRunningTime="2026-03-18 16:44:59.112467677 +0000 UTC m=+21.886042278" watchObservedRunningTime="2026-03-18 16:44:59.113062848 +0000 UTC m=+21.886637435" Mar 18 16:44:59.130896 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:59.130853 2536 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-224.ec2.internal" 
podStartSLOduration=21.130835703 podStartE2EDuration="21.130835703s" podCreationTimestamp="2026-03-18 16:44:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:44:59.130096362 +0000 UTC m=+21.903670949" watchObservedRunningTime="2026-03-18 16:44:59.130835703 +0000 UTC m=+21.904410289" Mar 18 16:44:59.906084 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:59.905798 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-f52ql" Mar 18 16:44:59.906335 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:59.905808 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cr96r" Mar 18 16:44:59.906335 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:59.906137 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-f52ql" podUID="b998ab02-161a-40e9-9e53-7183a98152de" Mar 18 16:44:59.906335 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:44:59.905832 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rjx6m" Mar 18 16:44:59.906335 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:59.906232 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cr96r" podUID="440da786-0ff1-4727-bd36-e32a3acc5a3c" Mar 18 16:44:59.906335 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:44:59.906320 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rjx6m" podUID="275a2fa6-277f-40dc-a2bc-749a97550e2e" Mar 18 16:45:00.105791 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:00.105755 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8pqnx" event={"ID":"c13c7653-cb9a-48c0-a38e-8c050869e5a0","Type":"ContainerStarted","Data":"90e83938813623029db02c1b33b8328992465228c49e413a12120a4d58a0650e"} Mar 18 16:45:00.109042 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:00.109022 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ts4mw_3e6fe5a1-3b13-4d51-a141-1280e1b25b3a/ovn-acl-logging/0.log" Mar 18 16:45:00.109382 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:00.109356 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw" event={"ID":"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a","Type":"ContainerStarted","Data":"e2f240644d1814f67bdcd4ff273f530e6e640662d18153bdb6d8e797001ba4cc"} Mar 18 16:45:00.122763 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:00.122717 2536 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8pqnx" podStartSLOduration=2.650446574 podStartE2EDuration="23.122700561s" podCreationTimestamp="2026-03-18 16:44:37 +0000 UTC" firstStartedPulling="2026-03-18 16:44:39.074306589 +0000 UTC m=+1.847881156" lastFinishedPulling="2026-03-18 
16:44:59.546560578 +0000 UTC m=+22.320135143" observedRunningTime="2026-03-18 16:45:00.122686677 +0000 UTC m=+22.896261264" watchObservedRunningTime="2026-03-18 16:45:00.122700561 +0000 UTC m=+22.896275149" Mar 18 16:45:01.906000 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:01.905968 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rjx6m" Mar 18 16:45:01.906565 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:01.905968 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-f52ql" Mar 18 16:45:01.906565 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:45:01.906078 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rjx6m" podUID="275a2fa6-277f-40dc-a2bc-749a97550e2e" Mar 18 16:45:01.906565 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:01.905968 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cr96r" Mar 18 16:45:01.906565 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:45:01.906156 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-f52ql" podUID="b998ab02-161a-40e9-9e53-7183a98152de" Mar 18 16:45:01.906565 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:45:01.906223 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cr96r" podUID="440da786-0ff1-4727-bd36-e32a3acc5a3c" Mar 18 16:45:03.116059 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:03.115826 2536 generic.go:358] "Generic (PLEG): container finished" podID="96282abf-ce09-4b33-baaf-73f9c5329541" containerID="cd69642f9f5db8d97ee841d5fcc08cf07ae50a1e5e8039954c4f29df702a2bea" exitCode=0 Mar 18 16:45:03.116717 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:03.115908 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d7xhk" event={"ID":"96282abf-ce09-4b33-baaf-73f9c5329541","Type":"ContainerDied","Data":"cd69642f9f5db8d97ee841d5fcc08cf07ae50a1e5e8039954c4f29df702a2bea"} Mar 18 16:45:03.119206 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:03.119190 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ts4mw_3e6fe5a1-3b13-4d51-a141-1280e1b25b3a/ovn-acl-logging/0.log" Mar 18 16:45:03.119583 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:03.119555 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw" event={"ID":"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a","Type":"ContainerStarted","Data":"387c36c9407eb526fb84abb50191aec01f13f3c598b1f1863bdfa55ece9161d9"} Mar 18 16:45:03.119885 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:03.119865 2536 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw" Mar 18 16:45:03.119974 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:03.119895 2536 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw" Mar 18 16:45:03.119974 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:03.119908 2536 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw" Mar 18 16:45:03.120113 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:03.119996 2536 scope.go:117] "RemoveContainer" containerID="4ef9ff2885e0555dc0a22b1512f2b38174145e3553f1375a9d17c40a9cb4e479" Mar 18 16:45:03.135001 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:03.134981 2536 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw" Mar 18 16:45:03.136100 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:03.136047 2536 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw" Mar 18 16:45:03.906246 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:03.906188 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-f52ql" Mar 18 16:45:03.906417 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:03.906190 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cr96r" Mar 18 16:45:03.906417 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:45:03.906403 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cr96r" podUID="440da786-0ff1-4727-bd36-e32a3acc5a3c" Mar 18 16:45:03.906541 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:03.906432 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rjx6m" Mar 18 16:45:03.906592 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:45:03.906551 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rjx6m" podUID="275a2fa6-277f-40dc-a2bc-749a97550e2e" Mar 18 16:45:03.906662 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:45:03.906638 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-f52ql" podUID="b998ab02-161a-40e9-9e53-7183a98152de" Mar 18 16:45:04.123818 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:04.123752 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ts4mw_3e6fe5a1-3b13-4d51-a141-1280e1b25b3a/ovn-acl-logging/0.log" Mar 18 16:45:04.124194 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:04.124060 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw" event={"ID":"3e6fe5a1-3b13-4d51-a141-1280e1b25b3a","Type":"ContainerStarted","Data":"9f1b651920955ea094dd95ee01e8d6aa824cc9309aa60a516a3a04151ba3a4f7"} Mar 18 16:45:04.125700 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:04.125676 2536 generic.go:358] "Generic (PLEG): container finished" podID="96282abf-ce09-4b33-baaf-73f9c5329541" containerID="08d0313510b7f0c2d9877ebfd41c778b5a2771e80d1a71b39e5c9f11e23d8baf" exitCode=0 Mar 18 16:45:04.125803 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:04.125717 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d7xhk" event={"ID":"96282abf-ce09-4b33-baaf-73f9c5329541","Type":"ContainerDied","Data":"08d0313510b7f0c2d9877ebfd41c778b5a2771e80d1a71b39e5c9f11e23d8baf"} Mar 18 16:45:04.150571 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:04.150534 2536 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw" podStartSLOduration=9.183377889 podStartE2EDuration="27.15052285s" podCreationTimestamp="2026-03-18 16:44:37 +0000 UTC" firstStartedPulling="2026-03-18 16:44:39.138453095 +0000 UTC m=+1.912027659" lastFinishedPulling="2026-03-18 16:44:57.105598056 +0000 UTC m=+19.879172620" observedRunningTime="2026-03-18 16:45:04.150020708 +0000 UTC m=+26.923595293" watchObservedRunningTime="2026-03-18 16:45:04.15052285 +0000 UTC m=+26.924097435" Mar 18 16:45:04.168653 ip-10-0-132-224 
kubenswrapper[2536]: I0318 16:45:04.168633 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-cr96r"] Mar 18 16:45:04.168745 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:04.168697 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cr96r" Mar 18 16:45:04.168803 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:45:04.168765 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cr96r" podUID="440da786-0ff1-4727-bd36-e32a3acc5a3c" Mar 18 16:45:04.174713 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:04.174689 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-f52ql"] Mar 18 16:45:04.174814 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:04.174774 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-f52ql" Mar 18 16:45:04.174909 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:45:04.174881 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-f52ql" podUID="b998ab02-161a-40e9-9e53-7183a98152de" Mar 18 16:45:04.175284 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:04.175252 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rjx6m"] Mar 18 16:45:04.175368 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:04.175354 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rjx6m" Mar 18 16:45:04.175476 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:45:04.175452 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rjx6m" podUID="275a2fa6-277f-40dc-a2bc-749a97550e2e" Mar 18 16:45:05.129915 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:05.129710 2536 generic.go:358] "Generic (PLEG): container finished" podID="96282abf-ce09-4b33-baaf-73f9c5329541" containerID="7644a7b4cb6ab325b4d9dff0f50c824cba4b837a2605184903b2697e6390fdc0" exitCode=0 Mar 18 16:45:05.129915 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:05.129795 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d7xhk" event={"ID":"96282abf-ce09-4b33-baaf-73f9c5329541","Type":"ContainerDied","Data":"7644a7b4cb6ab325b4d9dff0f50c824cba4b837a2605184903b2697e6390fdc0"} Mar 18 16:45:05.905970 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:05.905941 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cr96r" Mar 18 16:45:05.905970 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:05.905966 2536 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-f52ql" Mar 18 16:45:05.906153 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:05.905950 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rjx6m" Mar 18 16:45:05.906153 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:45:05.906068 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cr96r" podUID="440da786-0ff1-4727-bd36-e32a3acc5a3c" Mar 18 16:45:05.906247 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:45:05.906156 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rjx6m" podUID="275a2fa6-277f-40dc-a2bc-749a97550e2e" Mar 18 16:45:05.906331 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:45:05.906251 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-f52ql" podUID="b998ab02-161a-40e9-9e53-7183a98152de" Mar 18 16:45:07.906604 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:07.906521 2536 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cr96r" Mar 18 16:45:07.907147 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:45:07.906635 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cr96r" podUID="440da786-0ff1-4727-bd36-e32a3acc5a3c" Mar 18 16:45:07.907147 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:07.906712 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-f52ql" Mar 18 16:45:07.907147 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:45:07.906823 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-f52ql" podUID="b998ab02-161a-40e9-9e53-7183a98152de" Mar 18 16:45:07.907147 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:07.906856 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rjx6m" Mar 18 16:45:07.907147 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:45:07.906961 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rjx6m" podUID="275a2fa6-277f-40dc-a2bc-749a97550e2e" Mar 18 16:45:08.990127 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:08.990096 2536 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-224.ec2.internal" event="NodeReady" Mar 18 16:45:08.990671 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:08.990242 2536 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Mar 18 16:45:09.037615 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:09.037589 2536 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-x862v"] Mar 18 16:45:09.069983 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:09.069901 2536 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-l45t6"] Mar 18 16:45:09.070122 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:09.069982 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-x862v" Mar 18 16:45:09.075055 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:09.075032 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Mar 18 16:45:09.075331 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:09.075311 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Mar 18 16:45:09.075464 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:09.075446 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Mar 18 16:45:09.075517 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:09.075469 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-dz9s5\"" Mar 18 16:45:09.086107 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:09.086054 2536 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-x862v"] Mar 18 16:45:09.086107 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:09.086085 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-l45t6"] Mar 18 16:45:09.086259 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:09.086184 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-l45t6" Mar 18 16:45:09.088288 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:09.088247 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Mar 18 16:45:09.088417 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:09.088404 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Mar 18 16:45:09.088495 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:09.088424 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-wnp77\"" Mar 18 16:45:09.194356 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:09.194324 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f569b4c-304c-4347-827a-116204073ddf-cert\") pod \"ingress-canary-x862v\" (UID: \"0f569b4c-304c-4347-827a-116204073ddf\") " pod="openshift-ingress-canary/ingress-canary-x862v" Mar 18 16:45:09.194515 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:09.194400 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7zpj\" (UniqueName: \"kubernetes.io/projected/0f569b4c-304c-4347-827a-116204073ddf-kube-api-access-h7zpj\") pod \"ingress-canary-x862v\" (UID: \"0f569b4c-304c-4347-827a-116204073ddf\") " pod="openshift-ingress-canary/ingress-canary-x862v" Mar 18 16:45:09.194515 ip-10-0-132-224 kubenswrapper[2536]: I0318 
16:45:09.194447 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c40b2ed4-792d-4afc-bc1a-3aad44ac26e0-config-volume\") pod \"dns-default-l45t6\" (UID: \"c40b2ed4-792d-4afc-bc1a-3aad44ac26e0\") " pod="openshift-dns/dns-default-l45t6" Mar 18 16:45:09.194515 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:09.194472 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c40b2ed4-792d-4afc-bc1a-3aad44ac26e0-tmp-dir\") pod \"dns-default-l45t6\" (UID: \"c40b2ed4-792d-4afc-bc1a-3aad44ac26e0\") " pod="openshift-dns/dns-default-l45t6" Mar 18 16:45:09.194676 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:09.194518 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c40b2ed4-792d-4afc-bc1a-3aad44ac26e0-metrics-tls\") pod \"dns-default-l45t6\" (UID: \"c40b2ed4-792d-4afc-bc1a-3aad44ac26e0\") " pod="openshift-dns/dns-default-l45t6" Mar 18 16:45:09.194676 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:09.194598 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mrx9\" (UniqueName: \"kubernetes.io/projected/c40b2ed4-792d-4afc-bc1a-3aad44ac26e0-kube-api-access-9mrx9\") pod \"dns-default-l45t6\" (UID: \"c40b2ed4-792d-4afc-bc1a-3aad44ac26e0\") " pod="openshift-dns/dns-default-l45t6" Mar 18 16:45:09.295324 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:09.295291 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f569b4c-304c-4347-827a-116204073ddf-cert\") pod \"ingress-canary-x862v\" (UID: \"0f569b4c-304c-4347-827a-116204073ddf\") " pod="openshift-ingress-canary/ingress-canary-x862v" Mar 18 16:45:09.295478 ip-10-0-132-224 
kubenswrapper[2536]: I0318 16:45:09.295333 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h7zpj\" (UniqueName: \"kubernetes.io/projected/0f569b4c-304c-4347-827a-116204073ddf-kube-api-access-h7zpj\") pod \"ingress-canary-x862v\" (UID: \"0f569b4c-304c-4347-827a-116204073ddf\") " pod="openshift-ingress-canary/ingress-canary-x862v" Mar 18 16:45:09.295478 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:09.295362 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c40b2ed4-792d-4afc-bc1a-3aad44ac26e0-config-volume\") pod \"dns-default-l45t6\" (UID: \"c40b2ed4-792d-4afc-bc1a-3aad44ac26e0\") " pod="openshift-dns/dns-default-l45t6" Mar 18 16:45:09.295478 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:09.295385 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c40b2ed4-792d-4afc-bc1a-3aad44ac26e0-tmp-dir\") pod \"dns-default-l45t6\" (UID: \"c40b2ed4-792d-4afc-bc1a-3aad44ac26e0\") " pod="openshift-dns/dns-default-l45t6" Mar 18 16:45:09.295478 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:09.295426 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c40b2ed4-792d-4afc-bc1a-3aad44ac26e0-metrics-tls\") pod \"dns-default-l45t6\" (UID: \"c40b2ed4-792d-4afc-bc1a-3aad44ac26e0\") " pod="openshift-dns/dns-default-l45t6" Mar 18 16:45:09.295478 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:45:09.295456 2536 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 18 16:45:09.295699 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:09.295506 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9mrx9\" (UniqueName: 
\"kubernetes.io/projected/c40b2ed4-792d-4afc-bc1a-3aad44ac26e0-kube-api-access-9mrx9\") pod \"dns-default-l45t6\" (UID: \"c40b2ed4-792d-4afc-bc1a-3aad44ac26e0\") " pod="openshift-dns/dns-default-l45t6" Mar 18 16:45:09.295699 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:45:09.295524 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f569b4c-304c-4347-827a-116204073ddf-cert podName:0f569b4c-304c-4347-827a-116204073ddf nodeName:}" failed. No retries permitted until 2026-03-18 16:45:09.795505548 +0000 UTC m=+32.569080129 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0f569b4c-304c-4347-827a-116204073ddf-cert") pod "ingress-canary-x862v" (UID: "0f569b4c-304c-4347-827a-116204073ddf") : secret "canary-serving-cert" not found Mar 18 16:45:09.295699 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:45:09.295688 2536 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 18 16:45:09.295811 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:45:09.295728 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c40b2ed4-792d-4afc-bc1a-3aad44ac26e0-metrics-tls podName:c40b2ed4-792d-4afc-bc1a-3aad44ac26e0 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:09.795715175 +0000 UTC m=+32.569289743 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c40b2ed4-792d-4afc-bc1a-3aad44ac26e0-metrics-tls") pod "dns-default-l45t6" (UID: "c40b2ed4-792d-4afc-bc1a-3aad44ac26e0") : secret "dns-default-metrics-tls" not found Mar 18 16:45:09.295901 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:09.295863 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c40b2ed4-792d-4afc-bc1a-3aad44ac26e0-tmp-dir\") pod \"dns-default-l45t6\" (UID: \"c40b2ed4-792d-4afc-bc1a-3aad44ac26e0\") " pod="openshift-dns/dns-default-l45t6" Mar 18 16:45:09.295901 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:09.295885 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c40b2ed4-792d-4afc-bc1a-3aad44ac26e0-config-volume\") pod \"dns-default-l45t6\" (UID: \"c40b2ed4-792d-4afc-bc1a-3aad44ac26e0\") " pod="openshift-dns/dns-default-l45t6" Mar 18 16:45:09.306096 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:09.306064 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mrx9\" (UniqueName: \"kubernetes.io/projected/c40b2ed4-792d-4afc-bc1a-3aad44ac26e0-kube-api-access-9mrx9\") pod \"dns-default-l45t6\" (UID: \"c40b2ed4-792d-4afc-bc1a-3aad44ac26e0\") " pod="openshift-dns/dns-default-l45t6" Mar 18 16:45:09.306200 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:09.306149 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7zpj\" (UniqueName: \"kubernetes.io/projected/0f569b4c-304c-4347-827a-116204073ddf-kube-api-access-h7zpj\") pod \"ingress-canary-x862v\" (UID: \"0f569b4c-304c-4347-827a-116204073ddf\") " pod="openshift-ingress-canary/ingress-canary-x862v" Mar 18 16:45:09.798627 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:09.798587 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/c40b2ed4-792d-4afc-bc1a-3aad44ac26e0-metrics-tls\") pod \"dns-default-l45t6\" (UID: \"c40b2ed4-792d-4afc-bc1a-3aad44ac26e0\") " pod="openshift-dns/dns-default-l45t6" Mar 18 16:45:09.798784 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:09.798665 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f569b4c-304c-4347-827a-116204073ddf-cert\") pod \"ingress-canary-x862v\" (UID: \"0f569b4c-304c-4347-827a-116204073ddf\") " pod="openshift-ingress-canary/ingress-canary-x862v" Mar 18 16:45:09.798784 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:45:09.798737 2536 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 18 16:45:09.798871 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:45:09.798810 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c40b2ed4-792d-4afc-bc1a-3aad44ac26e0-metrics-tls podName:c40b2ed4-792d-4afc-bc1a-3aad44ac26e0 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:10.798793584 +0000 UTC m=+33.572368154 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c40b2ed4-792d-4afc-bc1a-3aad44ac26e0-metrics-tls") pod "dns-default-l45t6" (UID: "c40b2ed4-792d-4afc-bc1a-3aad44ac26e0") : secret "dns-default-metrics-tls" not found Mar 18 16:45:09.798871 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:45:09.798747 2536 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 18 16:45:09.798966 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:45:09.798879 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f569b4c-304c-4347-827a-116204073ddf-cert podName:0f569b4c-304c-4347-827a-116204073ddf nodeName:}" failed. 
No retries permitted until 2026-03-18 16:45:10.798863944 +0000 UTC m=+33.572438509 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0f569b4c-304c-4347-827a-116204073ddf-cert") pod "ingress-canary-x862v" (UID: "0f569b4c-304c-4347-827a-116204073ddf") : secret "canary-serving-cert" not found Mar 18 16:45:09.905358 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:09.905325 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cr96r" Mar 18 16:45:09.905514 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:09.905325 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-f52ql" Mar 18 16:45:09.905698 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:09.905325 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rjx6m" Mar 18 16:45:09.908359 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:09.908335 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-t48rq\"" Mar 18 16:45:09.908475 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:09.908380 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-w8h72\"" Mar 18 16:45:09.908475 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:09.908391 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Mar 18 16:45:09.908591 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:09.908342 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Mar 18 16:45:09.908643 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:09.908634 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Mar 18 16:45:09.908694 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:09.908684 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Mar 18 16:45:10.505932 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:10.505900 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b998ab02-161a-40e9-9e53-7183a98152de-original-pull-secret\") pod \"global-pull-secret-syncer-f52ql\" (UID: \"b998ab02-161a-40e9-9e53-7183a98152de\") " pod="kube-system/global-pull-secret-syncer-f52ql" Mar 18 16:45:10.506561 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:10.505947 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/275a2fa6-277f-40dc-a2bc-749a97550e2e-metrics-certs\") pod \"network-metrics-daemon-rjx6m\" (UID: \"275a2fa6-277f-40dc-a2bc-749a97550e2e\") " pod="openshift-multus/network-metrics-daemon-rjx6m" Mar 18 16:45:10.506561 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:45:10.506060 2536 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 18 16:45:10.506561 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:45:10.506110 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/275a2fa6-277f-40dc-a2bc-749a97550e2e-metrics-certs podName:275a2fa6-277f-40dc-a2bc-749a97550e2e nodeName:}" failed. No retries permitted until 2026-03-18 16:45:42.506096153 +0000 UTC m=+65.279670716 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/275a2fa6-277f-40dc-a2bc-749a97550e2e-metrics-certs") pod "network-metrics-daemon-rjx6m" (UID: "275a2fa6-277f-40dc-a2bc-749a97550e2e") : secret "metrics-daemon-secret" not found Mar 18 16:45:10.508392 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:10.508371 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b998ab02-161a-40e9-9e53-7183a98152de-original-pull-secret\") pod \"global-pull-secret-syncer-f52ql\" (UID: \"b998ab02-161a-40e9-9e53-7183a98152de\") " pod="kube-system/global-pull-secret-syncer-f52ql" Mar 18 16:45:10.524805 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:10.524783 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-f52ql" Mar 18 16:45:10.708230 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:10.708193 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fldmr\" (UniqueName: \"kubernetes.io/projected/440da786-0ff1-4727-bd36-e32a3acc5a3c-kube-api-access-fldmr\") pod \"network-check-target-cr96r\" (UID: \"440da786-0ff1-4727-bd36-e32a3acc5a3c\") " pod="openshift-network-diagnostics/network-check-target-cr96r" Mar 18 16:45:10.710977 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:10.710948 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fldmr\" (UniqueName: \"kubernetes.io/projected/440da786-0ff1-4727-bd36-e32a3acc5a3c-kube-api-access-fldmr\") pod \"network-check-target-cr96r\" (UID: \"440da786-0ff1-4727-bd36-e32a3acc5a3c\") " pod="openshift-network-diagnostics/network-check-target-cr96r" Mar 18 16:45:10.808781 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:10.808714 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/0f569b4c-304c-4347-827a-116204073ddf-cert\") pod \"ingress-canary-x862v\" (UID: \"0f569b4c-304c-4347-827a-116204073ddf\") " pod="openshift-ingress-canary/ingress-canary-x862v" Mar 18 16:45:10.808781 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:10.808762 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c40b2ed4-792d-4afc-bc1a-3aad44ac26e0-metrics-tls\") pod \"dns-default-l45t6\" (UID: \"c40b2ed4-792d-4afc-bc1a-3aad44ac26e0\") " pod="openshift-dns/dns-default-l45t6" Mar 18 16:45:10.808967 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:45:10.808858 2536 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 18 16:45:10.808967 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:45:10.808922 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f569b4c-304c-4347-827a-116204073ddf-cert podName:0f569b4c-304c-4347-827a-116204073ddf nodeName:}" failed. No retries permitted until 2026-03-18 16:45:12.808905544 +0000 UTC m=+35.582480108 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0f569b4c-304c-4347-827a-116204073ddf-cert") pod "ingress-canary-x862v" (UID: "0f569b4c-304c-4347-827a-116204073ddf") : secret "canary-serving-cert" not found Mar 18 16:45:10.808967 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:45:10.808859 2536 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 18 16:45:10.809099 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:45:10.809002 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c40b2ed4-792d-4afc-bc1a-3aad44ac26e0-metrics-tls podName:c40b2ed4-792d-4afc-bc1a-3aad44ac26e0 nodeName:}" failed. 
No retries permitted until 2026-03-18 16:45:12.808989651 +0000 UTC m=+35.582564233 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c40b2ed4-792d-4afc-bc1a-3aad44ac26e0-metrics-tls") pod "dns-default-l45t6" (UID: "c40b2ed4-792d-4afc-bc1a-3aad44ac26e0") : secret "dns-default-metrics-tls" not found
Mar 18 16:45:10.817628 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:10.817605 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cr96r"
Mar 18 16:45:11.129468 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:11.129303 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-cr96r"]
Mar 18 16:45:11.129990 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:11.129972 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-f52ql"]
Mar 18 16:45:11.133092 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:45:11.133068 2536 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb998ab02_161a_40e9_9e53_7183a98152de.slice/crio-00b47abf0a0239e4ae2c5507e94f8f85389ca023130aecf74caa4c1e14e92bd3 WatchSource:0}: Error finding container 00b47abf0a0239e4ae2c5507e94f8f85389ca023130aecf74caa4c1e14e92bd3: Status 404 returned error can't find the container with id 00b47abf0a0239e4ae2c5507e94f8f85389ca023130aecf74caa4c1e14e92bd3
Mar 18 16:45:11.133407 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:45:11.133385 2536 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod440da786_0ff1_4727_bd36_e32a3acc5a3c.slice/crio-351f88cce71939ef030c70e6ca0f0d1bbb255eecdcdedb4f06a163e4b815026b WatchSource:0}: Error finding container 351f88cce71939ef030c70e6ca0f0d1bbb255eecdcdedb4f06a163e4b815026b: Status 404 returned error can't find the container with id 351f88cce71939ef030c70e6ca0f0d1bbb255eecdcdedb4f06a163e4b815026b
Mar 18 16:45:11.140994 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:11.140965 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-f52ql" event={"ID":"b998ab02-161a-40e9-9e53-7183a98152de","Type":"ContainerStarted","Data":"00b47abf0a0239e4ae2c5507e94f8f85389ca023130aecf74caa4c1e14e92bd3"}
Mar 18 16:45:11.141814 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:11.141795 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-cr96r" event={"ID":"440da786-0ff1-4727-bd36-e32a3acc5a3c","Type":"ContainerStarted","Data":"351f88cce71939ef030c70e6ca0f0d1bbb255eecdcdedb4f06a163e4b815026b"}
Mar 18 16:45:12.147834 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:12.147761 2536 generic.go:358] "Generic (PLEG): container finished" podID="96282abf-ce09-4b33-baaf-73f9c5329541" containerID="549d6f4fa9f7f9db92d902a81f26fb7ae64b2296ee715502f62a5dc740818bdd" exitCode=0
Mar 18 16:45:12.147834 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:12.147804 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d7xhk" event={"ID":"96282abf-ce09-4b33-baaf-73f9c5329541","Type":"ContainerDied","Data":"549d6f4fa9f7f9db92d902a81f26fb7ae64b2296ee715502f62a5dc740818bdd"}
Mar 18 16:45:12.825598 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:12.825551 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f569b4c-304c-4347-827a-116204073ddf-cert\") pod \"ingress-canary-x862v\" (UID: \"0f569b4c-304c-4347-827a-116204073ddf\") " pod="openshift-ingress-canary/ingress-canary-x862v"
Mar 18 16:45:12.825779 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:12.825609 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c40b2ed4-792d-4afc-bc1a-3aad44ac26e0-metrics-tls\") pod \"dns-default-l45t6\" (UID: \"c40b2ed4-792d-4afc-bc1a-3aad44ac26e0\") " pod="openshift-dns/dns-default-l45t6"
Mar 18 16:45:12.825779 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:45:12.825665 2536 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 18 16:45:12.825779 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:45:12.825694 2536 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Mar 18 16:45:12.825779 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:45:12.825724 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f569b4c-304c-4347-827a-116204073ddf-cert podName:0f569b4c-304c-4347-827a-116204073ddf nodeName:}" failed. No retries permitted until 2026-03-18 16:45:16.825706222 +0000 UTC m=+39.599280802 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0f569b4c-304c-4347-827a-116204073ddf-cert") pod "ingress-canary-x862v" (UID: "0f569b4c-304c-4347-827a-116204073ddf") : secret "canary-serving-cert" not found
Mar 18 16:45:12.825779 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:45:12.825742 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c40b2ed4-792d-4afc-bc1a-3aad44ac26e0-metrics-tls podName:c40b2ed4-792d-4afc-bc1a-3aad44ac26e0 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:16.825733118 +0000 UTC m=+39.599307694 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c40b2ed4-792d-4afc-bc1a-3aad44ac26e0-metrics-tls") pod "dns-default-l45t6" (UID: "c40b2ed4-792d-4afc-bc1a-3aad44ac26e0") : secret "dns-default-metrics-tls" not found
Mar 18 16:45:13.153938 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:13.153850 2536 generic.go:358] "Generic (PLEG): container finished" podID="96282abf-ce09-4b33-baaf-73f9c5329541" containerID="032298c1d384e343b6c529ad48907483b8bf58dccc8580a2653edf1d1c4160d4" exitCode=0
Mar 18 16:45:13.153938 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:13.153897 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d7xhk" event={"ID":"96282abf-ce09-4b33-baaf-73f9c5329541","Type":"ContainerDied","Data":"032298c1d384e343b6c529ad48907483b8bf58dccc8580a2653edf1d1c4160d4"}
Mar 18 16:45:14.159000 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:14.158812 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d7xhk" event={"ID":"96282abf-ce09-4b33-baaf-73f9c5329541","Type":"ContainerStarted","Data":"4051af5ae3f3f48dd384c511119ad80f87f9344a45422122701d4947fdf200bc"}
Mar 18 16:45:14.181219 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:14.181169 2536 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-d7xhk" podStartSLOduration=4.05772605 podStartE2EDuration="36.18115244s" podCreationTimestamp="2026-03-18 16:44:38 +0000 UTC" firstStartedPulling="2026-03-18 16:44:39.191693056 +0000 UTC m=+1.965267620" lastFinishedPulling="2026-03-18 16:45:11.315119443 +0000 UTC m=+34.088694010" observedRunningTime="2026-03-18 16:45:14.179962073 +0000 UTC m=+36.953536671" watchObservedRunningTime="2026-03-18 16:45:14.18115244 +0000 UTC m=+36.954727026"
Mar 18 16:45:16.854196 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:16.854148 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f569b4c-304c-4347-827a-116204073ddf-cert\") pod \"ingress-canary-x862v\" (UID: \"0f569b4c-304c-4347-827a-116204073ddf\") " pod="openshift-ingress-canary/ingress-canary-x862v"
Mar 18 16:45:16.854690 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:16.854211 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c40b2ed4-792d-4afc-bc1a-3aad44ac26e0-metrics-tls\") pod \"dns-default-l45t6\" (UID: \"c40b2ed4-792d-4afc-bc1a-3aad44ac26e0\") " pod="openshift-dns/dns-default-l45t6"
Mar 18 16:45:16.854690 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:45:16.854320 2536 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 18 16:45:16.854690 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:45:16.854378 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f569b4c-304c-4347-827a-116204073ddf-cert podName:0f569b4c-304c-4347-827a-116204073ddf nodeName:}" failed. No retries permitted until 2026-03-18 16:45:24.854363022 +0000 UTC m=+47.627937592 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0f569b4c-304c-4347-827a-116204073ddf-cert") pod "ingress-canary-x862v" (UID: "0f569b4c-304c-4347-827a-116204073ddf") : secret "canary-serving-cert" not found
Mar 18 16:45:16.854690 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:45:16.854320 2536 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Mar 18 16:45:16.854690 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:45:16.854438 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c40b2ed4-792d-4afc-bc1a-3aad44ac26e0-metrics-tls podName:c40b2ed4-792d-4afc-bc1a-3aad44ac26e0 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:24.854427224 +0000 UTC m=+47.628001790 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c40b2ed4-792d-4afc-bc1a-3aad44ac26e0-metrics-tls") pod "dns-default-l45t6" (UID: "c40b2ed4-792d-4afc-bc1a-3aad44ac26e0") : secret "dns-default-metrics-tls" not found
Mar 18 16:45:17.166602 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:17.166509 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-cr96r" event={"ID":"440da786-0ff1-4727-bd36-e32a3acc5a3c","Type":"ContainerStarted","Data":"0a4e95c3c2e199e3881244716fd4c602523a35a161e50241dcf0c5375b1c3258"}
Mar 18 16:45:17.166776 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:17.166623 2536 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-cr96r"
Mar 18 16:45:17.167768 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:17.167747 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-f52ql" event={"ID":"b998ab02-161a-40e9-9e53-7183a98152de","Type":"ContainerStarted","Data":"f6843969afef119b6a341ef4c0ee11c8defaaecb10cc7a08a4f6ec133878dca1"}
Mar 18 16:45:17.181407 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:17.181358 2536 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-cr96r" podStartSLOduration=34.22188516 podStartE2EDuration="39.181343516s" podCreationTimestamp="2026-03-18 16:44:38 +0000 UTC" firstStartedPulling="2026-03-18 16:45:11.293654122 +0000 UTC m=+34.067228690" lastFinishedPulling="2026-03-18 16:45:16.253112475 +0000 UTC m=+39.026687046" observedRunningTime="2026-03-18 16:45:17.181140803 +0000 UTC m=+39.954715388" watchObservedRunningTime="2026-03-18 16:45:17.181343516 +0000 UTC m=+39.954918103"
Mar 18 16:45:17.195548 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:17.195498 2536 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-f52ql" podStartSLOduration=34.225424037 podStartE2EDuration="39.195486005s" podCreationTimestamp="2026-03-18 16:44:38 +0000 UTC" firstStartedPulling="2026-03-18 16:45:11.293771298 +0000 UTC m=+34.067345865" lastFinishedPulling="2026-03-18 16:45:16.263833269 +0000 UTC m=+39.037407833" observedRunningTime="2026-03-18 16:45:17.194959817 +0000 UTC m=+39.968534403" watchObservedRunningTime="2026-03-18 16:45:17.195486005 +0000 UTC m=+39.969060592"
Mar 18 16:45:24.907038 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:24.907000 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f569b4c-304c-4347-827a-116204073ddf-cert\") pod \"ingress-canary-x862v\" (UID: \"0f569b4c-304c-4347-827a-116204073ddf\") " pod="openshift-ingress-canary/ingress-canary-x862v"
Mar 18 16:45:24.907038 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:24.907045 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c40b2ed4-792d-4afc-bc1a-3aad44ac26e0-metrics-tls\") pod \"dns-default-l45t6\" (UID: \"c40b2ed4-792d-4afc-bc1a-3aad44ac26e0\") " pod="openshift-dns/dns-default-l45t6"
Mar 18 16:45:24.907553 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:45:24.907135 2536 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Mar 18 16:45:24.907553 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:45:24.907137 2536 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 18 16:45:24.907553 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:45:24.907185 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c40b2ed4-792d-4afc-bc1a-3aad44ac26e0-metrics-tls podName:c40b2ed4-792d-4afc-bc1a-3aad44ac26e0 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:40.907171482 +0000 UTC m=+63.680746046 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c40b2ed4-792d-4afc-bc1a-3aad44ac26e0-metrics-tls") pod "dns-default-l45t6" (UID: "c40b2ed4-792d-4afc-bc1a-3aad44ac26e0") : secret "dns-default-metrics-tls" not found
Mar 18 16:45:24.907553 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:45:24.907198 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f569b4c-304c-4347-827a-116204073ddf-cert podName:0f569b4c-304c-4347-827a-116204073ddf nodeName:}" failed. No retries permitted until 2026-03-18 16:45:40.90719263 +0000 UTC m=+63.680767193 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0f569b4c-304c-4347-827a-116204073ddf-cert") pod "ingress-canary-x862v" (UID: "0f569b4c-304c-4347-827a-116204073ddf") : secret "canary-serving-cert" not found
Mar 18 16:45:35.147087 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:35.147058 2536 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ts4mw"
Mar 18 16:45:41.000938 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:41.000902 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f569b4c-304c-4347-827a-116204073ddf-cert\") pod \"ingress-canary-x862v\" (UID: \"0f569b4c-304c-4347-827a-116204073ddf\") " pod="openshift-ingress-canary/ingress-canary-x862v"
Mar 18 16:45:41.000938 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:41.000945 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c40b2ed4-792d-4afc-bc1a-3aad44ac26e0-metrics-tls\") pod \"dns-default-l45t6\" (UID: \"c40b2ed4-792d-4afc-bc1a-3aad44ac26e0\") " pod="openshift-dns/dns-default-l45t6"
Mar 18 16:45:41.001470 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:45:41.001036 2536 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Mar 18 16:45:41.001470 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:45:41.001091 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c40b2ed4-792d-4afc-bc1a-3aad44ac26e0-metrics-tls podName:c40b2ed4-792d-4afc-bc1a-3aad44ac26e0 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:13.001076048 +0000 UTC m=+95.774650623 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c40b2ed4-792d-4afc-bc1a-3aad44ac26e0-metrics-tls") pod "dns-default-l45t6" (UID: "c40b2ed4-792d-4afc-bc1a-3aad44ac26e0") : secret "dns-default-metrics-tls" not found
Mar 18 16:45:41.001470 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:45:41.001036 2536 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 18 16:45:41.001470 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:45:41.001163 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f569b4c-304c-4347-827a-116204073ddf-cert podName:0f569b4c-304c-4347-827a-116204073ddf nodeName:}" failed. No retries permitted until 2026-03-18 16:46:13.001148887 +0000 UTC m=+95.774723450 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0f569b4c-304c-4347-827a-116204073ddf-cert") pod "ingress-canary-x862v" (UID: "0f569b4c-304c-4347-827a-116204073ddf") : secret "canary-serving-cert" not found
Mar 18 16:45:42.508140 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:42.508092 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/275a2fa6-277f-40dc-a2bc-749a97550e2e-metrics-certs\") pod \"network-metrics-daemon-rjx6m\" (UID: \"275a2fa6-277f-40dc-a2bc-749a97550e2e\") " pod="openshift-multus/network-metrics-daemon-rjx6m"
Mar 18 16:45:42.508506 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:45:42.508232 2536 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Mar 18 16:45:42.508506 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:45:42.508320 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/275a2fa6-277f-40dc-a2bc-749a97550e2e-metrics-certs podName:275a2fa6-277f-40dc-a2bc-749a97550e2e nodeName:}" failed. No retries permitted until 2026-03-18 16:46:46.508303776 +0000 UTC m=+129.281878341 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/275a2fa6-277f-40dc-a2bc-749a97550e2e-metrics-certs") pod "network-metrics-daemon-rjx6m" (UID: "275a2fa6-277f-40dc-a2bc-749a97550e2e") : secret "metrics-daemon-secret" not found
Mar 18 16:45:48.172676 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:45:48.172644 2536 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-cr96r"
Mar 18 16:46:13.006778 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:13.006665 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f569b4c-304c-4347-827a-116204073ddf-cert\") pod \"ingress-canary-x862v\" (UID: \"0f569b4c-304c-4347-827a-116204073ddf\") " pod="openshift-ingress-canary/ingress-canary-x862v"
Mar 18 16:46:13.006778 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:13.006705 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c40b2ed4-792d-4afc-bc1a-3aad44ac26e0-metrics-tls\") pod \"dns-default-l45t6\" (UID: \"c40b2ed4-792d-4afc-bc1a-3aad44ac26e0\") " pod="openshift-dns/dns-default-l45t6"
Mar 18 16:46:13.007327 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:46:13.006816 2536 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 18 16:46:13.007327 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:46:13.006903 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f569b4c-304c-4347-827a-116204073ddf-cert podName:0f569b4c-304c-4347-827a-116204073ddf nodeName:}" failed. No retries permitted until 2026-03-18 16:47:17.006885831 +0000 UTC m=+159.780460399 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0f569b4c-304c-4347-827a-116204073ddf-cert") pod "ingress-canary-x862v" (UID: "0f569b4c-304c-4347-827a-116204073ddf") : secret "canary-serving-cert" not found
Mar 18 16:46:13.007327 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:46:13.006816 2536 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Mar 18 16:46:13.007327 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:46:13.006955 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c40b2ed4-792d-4afc-bc1a-3aad44ac26e0-metrics-tls podName:c40b2ed4-792d-4afc-bc1a-3aad44ac26e0 nodeName:}" failed. No retries permitted until 2026-03-18 16:47:17.006944517 +0000 UTC m=+159.780519082 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c40b2ed4-792d-4afc-bc1a-3aad44ac26e0-metrics-tls") pod "dns-default-l45t6" (UID: "c40b2ed4-792d-4afc-bc1a-3aad44ac26e0") : secret "dns-default-metrics-tls" not found
Mar 18 16:46:46.527388 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:46.527351 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/275a2fa6-277f-40dc-a2bc-749a97550e2e-metrics-certs\") pod \"network-metrics-daemon-rjx6m\" (UID: \"275a2fa6-277f-40dc-a2bc-749a97550e2e\") " pod="openshift-multus/network-metrics-daemon-rjx6m"
Mar 18 16:46:46.527856 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:46:46.527476 2536 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Mar 18 16:46:46.527856 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:46:46.527540 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/275a2fa6-277f-40dc-a2bc-749a97550e2e-metrics-certs podName:275a2fa6-277f-40dc-a2bc-749a97550e2e nodeName:}" failed. No retries permitted until 2026-03-18 16:48:48.527525628 +0000 UTC m=+251.301100191 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/275a2fa6-277f-40dc-a2bc-749a97550e2e-metrics-certs") pod "network-metrics-daemon-rjx6m" (UID: "275a2fa6-277f-40dc-a2bc-749a97550e2e") : secret "metrics-daemon-secret" not found
Mar 18 16:46:49.253485 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.253450 2536 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-cc88fdd44-2r9d4"]
Mar 18 16:46:49.256228 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.256211 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-cc88fdd44-2r9d4"
Mar 18 16:46:49.258615 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.258597 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-5kbzd\""
Mar 18 16:46:49.272764 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.272730 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-cc88fdd44-2r9d4"]
Mar 18 16:46:49.345890 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.345862 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jmxb\" (UniqueName: \"kubernetes.io/projected/cfac9d9b-1748-485d-8093-c404ac8d2d3d-kube-api-access-5jmxb\") pod \"network-check-source-cc88fdd44-2r9d4\" (UID: \"cfac9d9b-1748-485d-8093-c404ac8d2d3d\") " pod="openshift-network-diagnostics/network-check-source-cc88fdd44-2r9d4"
Mar 18 16:46:49.353859 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.353832 2536 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-76b8565867-mx9sz"]
Mar 18 16:46:49.356601 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.356584 2536 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-559fbc86fb-nz727"]
Mar 18 16:46:49.356732 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.356715 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-76b8565867-mx9sz"
Mar 18 16:46:49.359209 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.359186 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-559fbc86fb-nz727"
Mar 18 16:46:49.359305 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.359290 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Mar 18 16:46:49.359411 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.359376 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-nqslq\""
Mar 18 16:46:49.359687 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.359670 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Mar 18 16:46:49.359779 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.359759 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Mar 18 16:46:49.359885 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.359872 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Mar 18 16:46:49.361678 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.361660 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Mar 18 16:46:49.362088 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.362071 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Mar 18 16:46:49.362191 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.362101 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Mar 18 16:46:49.362191 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.362160 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Mar 18 16:46:49.362313 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.362217 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Mar 18 16:46:49.362464 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.362446 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Mar 18 16:46:49.363009 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.362992 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-lw8kp\""
Mar 18 16:46:49.370832 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.370813 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Mar 18 16:46:49.371057 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.371037 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-76b8565867-mx9sz"]
Mar 18 16:46:49.371738 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.371720 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-559fbc86fb-nz727"]
Mar 18 16:46:49.446548 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.446515 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pdjd\" (UniqueName: \"kubernetes.io/projected/95e73a3f-8a85-403f-b00b-17524a80b500-kube-api-access-8pdjd\") pod \"console-operator-76b8565867-mx9sz\" (UID: \"95e73a3f-8a85-403f-b00b-17524a80b500\") " pod="openshift-console-operator/console-operator-76b8565867-mx9sz"
Mar 18 16:46:49.446548 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.446552 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/95e73a3f-8a85-403f-b00b-17524a80b500-trusted-ca\") pod \"console-operator-76b8565867-mx9sz\" (UID: \"95e73a3f-8a85-403f-b00b-17524a80b500\") " pod="openshift-console-operator/console-operator-76b8565867-mx9sz"
Mar 18 16:46:49.446798 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.446572 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2aea10b4-ea4d-46ca-a2c5-159a563fc276-service-ca-bundle\") pod \"router-default-559fbc86fb-nz727\" (UID: \"2aea10b4-ea4d-46ca-a2c5-159a563fc276\") " pod="openshift-ingress/router-default-559fbc86fb-nz727"
Mar 18 16:46:49.446798 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.446598 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2aea10b4-ea4d-46ca-a2c5-159a563fc276-metrics-certs\") pod \"router-default-559fbc86fb-nz727\" (UID: \"2aea10b4-ea4d-46ca-a2c5-159a563fc276\") " pod="openshift-ingress/router-default-559fbc86fb-nz727"
Mar 18 16:46:49.446798 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.446672 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnv5z\" (UniqueName: \"kubernetes.io/projected/2aea10b4-ea4d-46ca-a2c5-159a563fc276-kube-api-access-cnv5z\") pod \"router-default-559fbc86fb-nz727\" (UID: \"2aea10b4-ea4d-46ca-a2c5-159a563fc276\") " pod="openshift-ingress/router-default-559fbc86fb-nz727"
Mar 18 16:46:49.446798 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.446720 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95e73a3f-8a85-403f-b00b-17524a80b500-config\") pod \"console-operator-76b8565867-mx9sz\" (UID: \"95e73a3f-8a85-403f-b00b-17524a80b500\") " pod="openshift-console-operator/console-operator-76b8565867-mx9sz"
Mar 18 16:46:49.446798 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.446749 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/2aea10b4-ea4d-46ca-a2c5-159a563fc276-default-certificate\") pod \"router-default-559fbc86fb-nz727\" (UID: \"2aea10b4-ea4d-46ca-a2c5-159a563fc276\") " pod="openshift-ingress/router-default-559fbc86fb-nz727"
Mar 18 16:46:49.446798 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.446788 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5jmxb\" (UniqueName: \"kubernetes.io/projected/cfac9d9b-1748-485d-8093-c404ac8d2d3d-kube-api-access-5jmxb\") pod \"network-check-source-cc88fdd44-2r9d4\" (UID: \"cfac9d9b-1748-485d-8093-c404ac8d2d3d\") " pod="openshift-network-diagnostics/network-check-source-cc88fdd44-2r9d4"
Mar 18 16:46:49.447018 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.446814 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95e73a3f-8a85-403f-b00b-17524a80b500-serving-cert\") pod \"console-operator-76b8565867-mx9sz\" (UID: \"95e73a3f-8a85-403f-b00b-17524a80b500\") " pod="openshift-console-operator/console-operator-76b8565867-mx9sz"
Mar 18 16:46:49.447018 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.446869 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/2aea10b4-ea4d-46ca-a2c5-159a563fc276-stats-auth\") pod \"router-default-559fbc86fb-nz727\" (UID: \"2aea10b4-ea4d-46ca-a2c5-159a563fc276\") " pod="openshift-ingress/router-default-559fbc86fb-nz727"
Mar 18 16:46:49.467443 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.467421 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jmxb\" (UniqueName: \"kubernetes.io/projected/cfac9d9b-1748-485d-8093-c404ac8d2d3d-kube-api-access-5jmxb\") pod \"network-check-source-cc88fdd44-2r9d4\" (UID: \"cfac9d9b-1748-485d-8093-c404ac8d2d3d\") " pod="openshift-network-diagnostics/network-check-source-cc88fdd44-2r9d4"
Mar 18 16:46:49.547742 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.547647 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/2aea10b4-ea4d-46ca-a2c5-159a563fc276-stats-auth\") pod \"router-default-559fbc86fb-nz727\" (UID: \"2aea10b4-ea4d-46ca-a2c5-159a563fc276\") " pod="openshift-ingress/router-default-559fbc86fb-nz727"
Mar 18 16:46:49.547742 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.547709 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8pdjd\" (UniqueName: \"kubernetes.io/projected/95e73a3f-8a85-403f-b00b-17524a80b500-kube-api-access-8pdjd\") pod \"console-operator-76b8565867-mx9sz\" (UID: \"95e73a3f-8a85-403f-b00b-17524a80b500\") " pod="openshift-console-operator/console-operator-76b8565867-mx9sz"
Mar 18 16:46:49.547742 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.547740 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/95e73a3f-8a85-403f-b00b-17524a80b500-trusted-ca\") pod \"console-operator-76b8565867-mx9sz\" (UID: \"95e73a3f-8a85-403f-b00b-17524a80b500\") " pod="openshift-console-operator/console-operator-76b8565867-mx9sz"
Mar 18 16:46:49.548013 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.547767 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2aea10b4-ea4d-46ca-a2c5-159a563fc276-service-ca-bundle\") pod \"router-default-559fbc86fb-nz727\" (UID: \"2aea10b4-ea4d-46ca-a2c5-159a563fc276\") " pod="openshift-ingress/router-default-559fbc86fb-nz727"
Mar 18 16:46:49.548013 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.547804 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2aea10b4-ea4d-46ca-a2c5-159a563fc276-metrics-certs\") pod \"router-default-559fbc86fb-nz727\" (UID: \"2aea10b4-ea4d-46ca-a2c5-159a563fc276\") " pod="openshift-ingress/router-default-559fbc86fb-nz727"
Mar 18 16:46:49.548013 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.547829 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cnv5z\" (UniqueName: \"kubernetes.io/projected/2aea10b4-ea4d-46ca-a2c5-159a563fc276-kube-api-access-cnv5z\") pod \"router-default-559fbc86fb-nz727\" (UID: \"2aea10b4-ea4d-46ca-a2c5-159a563fc276\") " pod="openshift-ingress/router-default-559fbc86fb-nz727"
Mar 18 16:46:49.548013 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.547862 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95e73a3f-8a85-403f-b00b-17524a80b500-config\") pod \"console-operator-76b8565867-mx9sz\" (UID: \"95e73a3f-8a85-403f-b00b-17524a80b500\") " pod="openshift-console-operator/console-operator-76b8565867-mx9sz"
Mar 18 16:46:49.548013 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.547896 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/2aea10b4-ea4d-46ca-a2c5-159a563fc276-default-certificate\") pod \"router-default-559fbc86fb-nz727\" (UID: \"2aea10b4-ea4d-46ca-a2c5-159a563fc276\") " pod="openshift-ingress/router-default-559fbc86fb-nz727"
Mar 18 16:46:49.548013 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:46:49.547921 2536 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Mar 18 16:46:49.548013 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.547926 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95e73a3f-8a85-403f-b00b-17524a80b500-serving-cert\") pod \"console-operator-76b8565867-mx9sz\" (UID: \"95e73a3f-8a85-403f-b00b-17524a80b500\") " pod="openshift-console-operator/console-operator-76b8565867-mx9sz"
Mar 18 16:46:49.548013 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:46:49.547971 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2aea10b4-ea4d-46ca-a2c5-159a563fc276-service-ca-bundle podName:2aea10b4-ea4d-46ca-a2c5-159a563fc276 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:50.047943778 +0000 UTC m=+132.821518356 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/2aea10b4-ea4d-46ca-a2c5-159a563fc276-service-ca-bundle") pod "router-default-559fbc86fb-nz727" (UID: "2aea10b4-ea4d-46ca-a2c5-159a563fc276") : configmap references non-existent config key: service-ca.crt
Mar 18 16:46:49.548013 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:46:49.548006 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2aea10b4-ea4d-46ca-a2c5-159a563fc276-metrics-certs podName:2aea10b4-ea4d-46ca-a2c5-159a563fc276 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:50.047989594 +0000 UTC m=+132.821564158 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2aea10b4-ea4d-46ca-a2c5-159a563fc276-metrics-certs") pod "router-default-559fbc86fb-nz727" (UID: "2aea10b4-ea4d-46ca-a2c5-159a563fc276") : secret "router-metrics-certs-default" not found
Mar 18 16:46:49.548863 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.548839 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95e73a3f-8a85-403f-b00b-17524a80b500-config\") pod \"console-operator-76b8565867-mx9sz\" (UID: \"95e73a3f-8a85-403f-b00b-17524a80b500\") " pod="openshift-console-operator/console-operator-76b8565867-mx9sz"
Mar 18 16:46:49.549182 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.549159 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/95e73a3f-8a85-403f-b00b-17524a80b500-trusted-ca\") pod \"console-operator-76b8565867-mx9sz\" (UID: \"95e73a3f-8a85-403f-b00b-17524a80b500\") " pod="openshift-console-operator/console-operator-76b8565867-mx9sz"
Mar 18 16:46:49.550284 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.550247 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95e73a3f-8a85-403f-b00b-17524a80b500-serving-cert\") pod \"console-operator-76b8565867-mx9sz\" (UID: \"95e73a3f-8a85-403f-b00b-17524a80b500\") " pod="openshift-console-operator/console-operator-76b8565867-mx9sz"
Mar 18 16:46:49.550639 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.550620 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/2aea10b4-ea4d-46ca-a2c5-159a563fc276-default-certificate\") pod \"router-default-559fbc86fb-nz727\" (UID: \"2aea10b4-ea4d-46ca-a2c5-159a563fc276\") " pod="openshift-ingress/router-default-559fbc86fb-nz727"
Mar 18 16:46:49.550719 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.550674 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/2aea10b4-ea4d-46ca-a2c5-159a563fc276-stats-auth\") pod \"router-default-559fbc86fb-nz727\" (UID: \"2aea10b4-ea4d-46ca-a2c5-159a563fc276\") " pod="openshift-ingress/router-default-559fbc86fb-nz727"
Mar 18 16:46:49.555964 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.555942 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnv5z\" (UniqueName: \"kubernetes.io/projected/2aea10b4-ea4d-46ca-a2c5-159a563fc276-kube-api-access-cnv5z\") pod \"router-default-559fbc86fb-nz727\" (UID: \"2aea10b4-ea4d-46ca-a2c5-159a563fc276\") " pod="openshift-ingress/router-default-559fbc86fb-nz727"
Mar 18 16:46:49.556413 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.556396 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pdjd\" (UniqueName: \"kubernetes.io/projected/95e73a3f-8a85-403f-b00b-17524a80b500-kube-api-access-8pdjd\") pod \"console-operator-76b8565867-mx9sz\" (UID: \"95e73a3f-8a85-403f-b00b-17524a80b500\") " pod="openshift-console-operator/console-operator-76b8565867-mx9sz"
Mar 18 16:46:49.564882 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.564865 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-cc88fdd44-2r9d4"
Mar 18 16:46:49.665848 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.665817 2536 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-76b8565867-mx9sz" Mar 18 16:46:49.680686 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.680662 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-cc88fdd44-2r9d4"] Mar 18 16:46:49.683371 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:46:49.683341 2536 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfac9d9b_1748_485d_8093_c404ac8d2d3d.slice/crio-3cff795f65fbe5218ebc4e19d1516f52ca7a5a73f8ddcc853db5225dcf519453 WatchSource:0}: Error finding container 3cff795f65fbe5218ebc4e19d1516f52ca7a5a73f8ddcc853db5225dcf519453: Status 404 returned error can't find the container with id 3cff795f65fbe5218ebc4e19d1516f52ca7a5a73f8ddcc853db5225dcf519453 Mar 18 16:46:49.782182 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:49.781944 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-76b8565867-mx9sz"] Mar 18 16:46:49.785725 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:46:49.785701 2536 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95e73a3f_8a85_403f_b00b_17524a80b500.slice/crio-39d341f8b3c6f3a90caa04abbe250658a5a5f1d1b931d4e6b03baf63ab8af2c6 WatchSource:0}: Error finding container 39d341f8b3c6f3a90caa04abbe250658a5a5f1d1b931d4e6b03baf63ab8af2c6: Status 404 returned error can't find the container with id 39d341f8b3c6f3a90caa04abbe250658a5a5f1d1b931d4e6b03baf63ab8af2c6 Mar 18 16:46:50.051277 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:50.051245 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2aea10b4-ea4d-46ca-a2c5-159a563fc276-service-ca-bundle\") pod \"router-default-559fbc86fb-nz727\" (UID: \"2aea10b4-ea4d-46ca-a2c5-159a563fc276\") " 
pod="openshift-ingress/router-default-559fbc86fb-nz727" Mar 18 16:46:50.051481 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:50.051310 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2aea10b4-ea4d-46ca-a2c5-159a563fc276-metrics-certs\") pod \"router-default-559fbc86fb-nz727\" (UID: \"2aea10b4-ea4d-46ca-a2c5-159a563fc276\") " pod="openshift-ingress/router-default-559fbc86fb-nz727" Mar 18 16:46:50.051481 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:46:50.051438 2536 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Mar 18 16:46:50.051481 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:46:50.051440 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2aea10b4-ea4d-46ca-a2c5-159a563fc276-service-ca-bundle podName:2aea10b4-ea4d-46ca-a2c5-159a563fc276 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:51.051419088 +0000 UTC m=+133.824993656 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/2aea10b4-ea4d-46ca-a2c5-159a563fc276-service-ca-bundle") pod "router-default-559fbc86fb-nz727" (UID: "2aea10b4-ea4d-46ca-a2c5-159a563fc276") : configmap references non-existent config key: service-ca.crt Mar 18 16:46:50.051655 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:46:50.051494 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2aea10b4-ea4d-46ca-a2c5-159a563fc276-metrics-certs podName:2aea10b4-ea4d-46ca-a2c5-159a563fc276 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:51.051477407 +0000 UTC m=+133.825051975 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2aea10b4-ea4d-46ca-a2c5-159a563fc276-metrics-certs") pod "router-default-559fbc86fb-nz727" (UID: "2aea10b4-ea4d-46ca-a2c5-159a563fc276") : secret "router-metrics-certs-default" not found Mar 18 16:46:50.352819 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:50.352731 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-76b8565867-mx9sz" event={"ID":"95e73a3f-8a85-403f-b00b-17524a80b500","Type":"ContainerStarted","Data":"39d341f8b3c6f3a90caa04abbe250658a5a5f1d1b931d4e6b03baf63ab8af2c6"} Mar 18 16:46:50.354127 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:50.354095 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-cc88fdd44-2r9d4" event={"ID":"cfac9d9b-1748-485d-8093-c404ac8d2d3d","Type":"ContainerStarted","Data":"35e4205cd4c07dc8c7a9de3352a206244d648f56879af8647b7ee8b90cd21453"} Mar 18 16:46:50.354127 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:50.354127 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-cc88fdd44-2r9d4" event={"ID":"cfac9d9b-1748-485d-8093-c404ac8d2d3d","Type":"ContainerStarted","Data":"3cff795f65fbe5218ebc4e19d1516f52ca7a5a73f8ddcc853db5225dcf519453"} Mar 18 16:46:50.370401 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:50.370353 2536 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-cc88fdd44-2r9d4" podStartSLOduration=1.370337125 podStartE2EDuration="1.370337125s" podCreationTimestamp="2026-03-18 16:46:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:46:50.369717807 +0000 UTC m=+133.143292394" watchObservedRunningTime="2026-03-18 16:46:50.370337125 +0000 UTC m=+133.143911714" Mar 18 16:46:51.060908 
ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:51.060859 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2aea10b4-ea4d-46ca-a2c5-159a563fc276-service-ca-bundle\") pod \"router-default-559fbc86fb-nz727\" (UID: \"2aea10b4-ea4d-46ca-a2c5-159a563fc276\") " pod="openshift-ingress/router-default-559fbc86fb-nz727" Mar 18 16:46:51.060908 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:51.060916 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2aea10b4-ea4d-46ca-a2c5-159a563fc276-metrics-certs\") pod \"router-default-559fbc86fb-nz727\" (UID: \"2aea10b4-ea4d-46ca-a2c5-159a563fc276\") " pod="openshift-ingress/router-default-559fbc86fb-nz727" Mar 18 16:46:51.061154 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:46:51.061056 2536 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Mar 18 16:46:51.061154 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:46:51.061062 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2aea10b4-ea4d-46ca-a2c5-159a563fc276-service-ca-bundle podName:2aea10b4-ea4d-46ca-a2c5-159a563fc276 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:53.061039765 +0000 UTC m=+135.834614352 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/2aea10b4-ea4d-46ca-a2c5-159a563fc276-service-ca-bundle") pod "router-default-559fbc86fb-nz727" (UID: "2aea10b4-ea4d-46ca-a2c5-159a563fc276") : configmap references non-existent config key: service-ca.crt Mar 18 16:46:51.061154 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:46:51.061131 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2aea10b4-ea4d-46ca-a2c5-159a563fc276-metrics-certs podName:2aea10b4-ea4d-46ca-a2c5-159a563fc276 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:53.061114373 +0000 UTC m=+135.834688942 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2aea10b4-ea4d-46ca-a2c5-159a563fc276-metrics-certs") pod "router-default-559fbc86fb-nz727" (UID: "2aea10b4-ea4d-46ca-a2c5-159a563fc276") : secret "router-metrics-certs-default" not found Mar 18 16:46:52.359658 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:52.359624 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-mx9sz_95e73a3f-8a85-403f-b00b-17524a80b500/console-operator/0.log" Mar 18 16:46:52.360013 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:52.359669 2536 generic.go:358] "Generic (PLEG): container finished" podID="95e73a3f-8a85-403f-b00b-17524a80b500" containerID="708b515d94e8871e67b98f48c964709fec19c2b083775027cb597939b42ccf54" exitCode=255 Mar 18 16:46:52.360013 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:52.359704 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-76b8565867-mx9sz" event={"ID":"95e73a3f-8a85-403f-b00b-17524a80b500","Type":"ContainerDied","Data":"708b515d94e8871e67b98f48c964709fec19c2b083775027cb597939b42ccf54"} Mar 18 16:46:52.360013 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:52.359928 2536 scope.go:117] "RemoveContainer" 
containerID="708b515d94e8871e67b98f48c964709fec19c2b083775027cb597939b42ccf54" Mar 18 16:46:53.077624 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:53.077561 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2aea10b4-ea4d-46ca-a2c5-159a563fc276-metrics-certs\") pod \"router-default-559fbc86fb-nz727\" (UID: \"2aea10b4-ea4d-46ca-a2c5-159a563fc276\") " pod="openshift-ingress/router-default-559fbc86fb-nz727" Mar 18 16:46:53.077796 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:53.077714 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2aea10b4-ea4d-46ca-a2c5-159a563fc276-service-ca-bundle\") pod \"router-default-559fbc86fb-nz727\" (UID: \"2aea10b4-ea4d-46ca-a2c5-159a563fc276\") " pod="openshift-ingress/router-default-559fbc86fb-nz727" Mar 18 16:46:53.077796 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:46:53.077717 2536 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Mar 18 16:46:53.077796 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:46:53.077786 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2aea10b4-ea4d-46ca-a2c5-159a563fc276-metrics-certs podName:2aea10b4-ea4d-46ca-a2c5-159a563fc276 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:57.077769424 +0000 UTC m=+139.851343988 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2aea10b4-ea4d-46ca-a2c5-159a563fc276-metrics-certs") pod "router-default-559fbc86fb-nz727" (UID: "2aea10b4-ea4d-46ca-a2c5-159a563fc276") : secret "router-metrics-certs-default" not found Mar 18 16:46:53.077907 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:46:53.077824 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2aea10b4-ea4d-46ca-a2c5-159a563fc276-service-ca-bundle podName:2aea10b4-ea4d-46ca-a2c5-159a563fc276 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:57.077809584 +0000 UTC m=+139.851384148 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/2aea10b4-ea4d-46ca-a2c5-159a563fc276-service-ca-bundle") pod "router-default-559fbc86fb-nz727" (UID: "2aea10b4-ea4d-46ca-a2c5-159a563fc276") : configmap references non-existent config key: service-ca.crt Mar 18 16:46:53.363387 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:53.363313 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-mx9sz_95e73a3f-8a85-403f-b00b-17524a80b500/console-operator/1.log" Mar 18 16:46:53.363727 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:53.363709 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-mx9sz_95e73a3f-8a85-403f-b00b-17524a80b500/console-operator/0.log" Mar 18 16:46:53.363773 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:53.363738 2536 generic.go:358] "Generic (PLEG): container finished" podID="95e73a3f-8a85-403f-b00b-17524a80b500" containerID="499d529c823274a1c09941c2964e22f458c3917cf0c8bc717732ced3875663b4" exitCode=255 Mar 18 16:46:53.363803 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:53.363769 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console-operator/console-operator-76b8565867-mx9sz" event={"ID":"95e73a3f-8a85-403f-b00b-17524a80b500","Type":"ContainerDied","Data":"499d529c823274a1c09941c2964e22f458c3917cf0c8bc717732ced3875663b4"} Mar 18 16:46:53.363842 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:53.363815 2536 scope.go:117] "RemoveContainer" containerID="708b515d94e8871e67b98f48c964709fec19c2b083775027cb597939b42ccf54" Mar 18 16:46:53.364097 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:53.364074 2536 scope.go:117] "RemoveContainer" containerID="499d529c823274a1c09941c2964e22f458c3917cf0c8bc717732ced3875663b4" Mar 18 16:46:53.364300 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:46:53.364258 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-76b8565867-mx9sz_openshift-console-operator(95e73a3f-8a85-403f-b00b-17524a80b500)\"" pod="openshift-console-operator/console-operator-76b8565867-mx9sz" podUID="95e73a3f-8a85-403f-b00b-17524a80b500" Mar 18 16:46:54.366706 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:54.366680 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-mx9sz_95e73a3f-8a85-403f-b00b-17524a80b500/console-operator/1.log" Mar 18 16:46:54.367083 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:54.366996 2536 scope.go:117] "RemoveContainer" containerID="499d529c823274a1c09941c2964e22f458c3917cf0c8bc717732ced3875663b4" Mar 18 16:46:54.367162 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:46:54.367144 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-76b8565867-mx9sz_openshift-console-operator(95e73a3f-8a85-403f-b00b-17524a80b500)\"" 
pod="openshift-console-operator/console-operator-76b8565867-mx9sz" podUID="95e73a3f-8a85-403f-b00b-17524a80b500" Mar 18 16:46:55.250817 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:55.250789 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9fw2f_37693aba-cef0-4f5a-a523-bd82dbff0143/dns-node-resolver/0.log" Mar 18 16:46:56.251354 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:56.251331 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-jnvc6_a421b67e-f253-48e4-b5a3-4a895c3bf6d2/node-ca/0.log" Mar 18 16:46:57.107099 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:57.107065 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2aea10b4-ea4d-46ca-a2c5-159a563fc276-service-ca-bundle\") pod \"router-default-559fbc86fb-nz727\" (UID: \"2aea10b4-ea4d-46ca-a2c5-159a563fc276\") " pod="openshift-ingress/router-default-559fbc86fb-nz727" Mar 18 16:46:57.107360 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:57.107107 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2aea10b4-ea4d-46ca-a2c5-159a563fc276-metrics-certs\") pod \"router-default-559fbc86fb-nz727\" (UID: \"2aea10b4-ea4d-46ca-a2c5-159a563fc276\") " pod="openshift-ingress/router-default-559fbc86fb-nz727" Mar 18 16:46:57.107360 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:46:57.107197 2536 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Mar 18 16:46:57.107360 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:46:57.107236 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2aea10b4-ea4d-46ca-a2c5-159a563fc276-service-ca-bundle podName:2aea10b4-ea4d-46ca-a2c5-159a563fc276 nodeName:}" failed. 
No retries permitted until 2026-03-18 16:47:05.107218333 +0000 UTC m=+147.880792896 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/2aea10b4-ea4d-46ca-a2c5-159a563fc276-service-ca-bundle") pod "router-default-559fbc86fb-nz727" (UID: "2aea10b4-ea4d-46ca-a2c5-159a563fc276") : configmap references non-existent config key: service-ca.crt Mar 18 16:46:57.107360 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:46:57.107260 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2aea10b4-ea4d-46ca-a2c5-159a563fc276-metrics-certs podName:2aea10b4-ea4d-46ca-a2c5-159a563fc276 nodeName:}" failed. No retries permitted until 2026-03-18 16:47:05.107252617 +0000 UTC m=+147.880827181 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2aea10b4-ea4d-46ca-a2c5-159a563fc276-metrics-certs") pod "router-default-559fbc86fb-nz727" (UID: "2aea10b4-ea4d-46ca-a2c5-159a563fc276") : secret "router-metrics-certs-default" not found Mar 18 16:46:59.666692 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:59.666660 2536 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-76b8565867-mx9sz" Mar 18 16:46:59.666692 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:59.666690 2536 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-76b8565867-mx9sz" Mar 18 16:46:59.667087 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:46:59.666995 2536 scope.go:117] "RemoveContainer" containerID="499d529c823274a1c09941c2964e22f458c3917cf0c8bc717732ced3875663b4" Mar 18 16:46:59.667153 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:46:59.667137 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed 
container=console-operator pod=console-operator-76b8565867-mx9sz_openshift-console-operator(95e73a3f-8a85-403f-b00b-17524a80b500)\"" pod="openshift-console-operator/console-operator-76b8565867-mx9sz" podUID="95e73a3f-8a85-403f-b00b-17524a80b500" Mar 18 16:47:05.172113 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:05.172075 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2aea10b4-ea4d-46ca-a2c5-159a563fc276-service-ca-bundle\") pod \"router-default-559fbc86fb-nz727\" (UID: \"2aea10b4-ea4d-46ca-a2c5-159a563fc276\") " pod="openshift-ingress/router-default-559fbc86fb-nz727" Mar 18 16:47:05.172587 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:05.172126 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2aea10b4-ea4d-46ca-a2c5-159a563fc276-metrics-certs\") pod \"router-default-559fbc86fb-nz727\" (UID: \"2aea10b4-ea4d-46ca-a2c5-159a563fc276\") " pod="openshift-ingress/router-default-559fbc86fb-nz727" Mar 18 16:47:05.172657 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:05.172635 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2aea10b4-ea4d-46ca-a2c5-159a563fc276-service-ca-bundle\") pod \"router-default-559fbc86fb-nz727\" (UID: \"2aea10b4-ea4d-46ca-a2c5-159a563fc276\") " pod="openshift-ingress/router-default-559fbc86fb-nz727" Mar 18 16:47:05.174374 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:05.174352 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2aea10b4-ea4d-46ca-a2c5-159a563fc276-metrics-certs\") pod \"router-default-559fbc86fb-nz727\" (UID: \"2aea10b4-ea4d-46ca-a2c5-159a563fc276\") " pod="openshift-ingress/router-default-559fbc86fb-nz727" Mar 18 16:47:05.271741 ip-10-0-132-224 kubenswrapper[2536]: I0318 
16:47:05.271705 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-559fbc86fb-nz727" Mar 18 16:47:05.386680 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:05.386641 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-559fbc86fb-nz727"] Mar 18 16:47:05.389667 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:47:05.389641 2536 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2aea10b4_ea4d_46ca_a2c5_159a563fc276.slice/crio-66335b268fe6da4fdf88985fb38175570f004b476af56c36ced4a9e2f420034c WatchSource:0}: Error finding container 66335b268fe6da4fdf88985fb38175570f004b476af56c36ced4a9e2f420034c: Status 404 returned error can't find the container with id 66335b268fe6da4fdf88985fb38175570f004b476af56c36ced4a9e2f420034c Mar 18 16:47:06.391050 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:06.391014 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-559fbc86fb-nz727" event={"ID":"2aea10b4-ea4d-46ca-a2c5-159a563fc276","Type":"ContainerStarted","Data":"37190988f0944390e2eb24b1e585cf94e3d1bcb9c3eaa4253e26f45857bc2736"} Mar 18 16:47:06.391420 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:06.391056 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-559fbc86fb-nz727" event={"ID":"2aea10b4-ea4d-46ca-a2c5-159a563fc276","Type":"ContainerStarted","Data":"66335b268fe6da4fdf88985fb38175570f004b476af56c36ced4a9e2f420034c"} Mar 18 16:47:06.409070 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:06.409028 2536 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-559fbc86fb-nz727" podStartSLOduration=17.409014914 podStartE2EDuration="17.409014914s" podCreationTimestamp="2026-03-18 16:46:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:47:06.408389418 +0000 UTC m=+149.181964023" watchObservedRunningTime="2026-03-18 16:47:06.409014914 +0000 UTC m=+149.182589523" Mar 18 16:47:07.272698 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:07.272654 2536 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-559fbc86fb-nz727" Mar 18 16:47:07.275154 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:07.275131 2536 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-559fbc86fb-nz727" Mar 18 16:47:07.393494 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:07.393464 2536 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-559fbc86fb-nz727" Mar 18 16:47:07.394607 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:07.394589 2536 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-559fbc86fb-nz727" Mar 18 16:47:12.082263 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:47:12.082221 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-x862v" podUID="0f569b4c-304c-4347-827a-116204073ddf" Mar 18 16:47:12.096417 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:47:12.096388 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-l45t6" podUID="c40b2ed4-792d-4afc-bc1a-3aad44ac26e0" Mar 18 16:47:12.404773 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:12.404696 2536 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-l45t6"
Mar 18 16:47:12.404898 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:12.404696 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-x862v"
Mar 18 16:47:12.930772 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:47:12.930718 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-rjx6m" podUID="275a2fa6-277f-40dc-a2bc-749a97550e2e"
Mar 18 16:47:13.906534 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:13.906506 2536 scope.go:117] "RemoveContainer" containerID="499d529c823274a1c09941c2964e22f458c3917cf0c8bc717732ced3875663b4"
Mar 18 16:47:14.410360 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:14.410335 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-mx9sz_95e73a3f-8a85-403f-b00b-17524a80b500/console-operator/2.log"
Mar 18 16:47:14.410685 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:14.410667 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-mx9sz_95e73a3f-8a85-403f-b00b-17524a80b500/console-operator/1.log"
Mar 18 16:47:14.410783 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:14.410705 2536 generic.go:358] "Generic (PLEG): container finished" podID="95e73a3f-8a85-403f-b00b-17524a80b500" containerID="2eecbd7aad2517e638c2f2ffa41628d27302acd99633acfdb5892c6e1ffa75fd" exitCode=255
Mar 18 16:47:14.410783 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:14.410757 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-76b8565867-mx9sz" event={"ID":"95e73a3f-8a85-403f-b00b-17524a80b500","Type":"ContainerDied","Data":"2eecbd7aad2517e638c2f2ffa41628d27302acd99633acfdb5892c6e1ffa75fd"}
Mar 18 16:47:14.410890 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:14.410795 2536 scope.go:117] "RemoveContainer" containerID="499d529c823274a1c09941c2964e22f458c3917cf0c8bc717732ced3875663b4"
Mar 18 16:47:14.411121 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:14.411098 2536 scope.go:117] "RemoveContainer" containerID="2eecbd7aad2517e638c2f2ffa41628d27302acd99633acfdb5892c6e1ffa75fd"
Mar 18 16:47:14.411289 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:47:14.411255 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-76b8565867-mx9sz_openshift-console-operator(95e73a3f-8a85-403f-b00b-17524a80b500)\"" pod="openshift-console-operator/console-operator-76b8565867-mx9sz" podUID="95e73a3f-8a85-403f-b00b-17524a80b500"
Mar 18 16:47:15.415444 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:15.415414 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-mx9sz_95e73a3f-8a85-403f-b00b-17524a80b500/console-operator/2.log"
Mar 18 16:47:17.050317 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:17.050264 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f569b4c-304c-4347-827a-116204073ddf-cert\") pod \"ingress-canary-x862v\" (UID: \"0f569b4c-304c-4347-827a-116204073ddf\") " pod="openshift-ingress-canary/ingress-canary-x862v"
Mar 18 16:47:17.050683 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:17.050332 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c40b2ed4-792d-4afc-bc1a-3aad44ac26e0-metrics-tls\") pod \"dns-default-l45t6\" (UID: \"c40b2ed4-792d-4afc-bc1a-3aad44ac26e0\") " pod="openshift-dns/dns-default-l45t6"
Mar 18 16:47:17.052686 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:17.052649 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c40b2ed4-792d-4afc-bc1a-3aad44ac26e0-metrics-tls\") pod \"dns-default-l45t6\" (UID: \"c40b2ed4-792d-4afc-bc1a-3aad44ac26e0\") " pod="openshift-dns/dns-default-l45t6"
Mar 18 16:47:17.052817 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:17.052733 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f569b4c-304c-4347-827a-116204073ddf-cert\") pod \"ingress-canary-x862v\" (UID: \"0f569b4c-304c-4347-827a-116204073ddf\") " pod="openshift-ingress-canary/ingress-canary-x862v"
Mar 18 16:47:17.207539 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:17.207514 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-wnp77\""
Mar 18 16:47:17.207932 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:17.207916 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-dz9s5\""
Mar 18 16:47:17.215615 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:17.215593 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-l45t6"
Mar 18 16:47:17.215687 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:17.215671 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-x862v"
Mar 18 16:47:17.343466 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:17.343440 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-l45t6"]
Mar 18 16:47:17.346384 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:47:17.346352 2536 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc40b2ed4_792d_4afc_bc1a_3aad44ac26e0.slice/crio-7cc7e83911238891a01bebf0d0079c781d1bfa95b4f651ff594eeec4f6edb909 WatchSource:0}: Error finding container 7cc7e83911238891a01bebf0d0079c781d1bfa95b4f651ff594eeec4f6edb909: Status 404 returned error can't find the container with id 7cc7e83911238891a01bebf0d0079c781d1bfa95b4f651ff594eeec4f6edb909
Mar 18 16:47:17.358240 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:17.358219 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-x862v"]
Mar 18 16:47:17.360558 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:47:17.360536 2536 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f569b4c_304c_4347_827a_116204073ddf.slice/crio-da32df1e98c6cd9d5612044f986edd7e92436a0c5f5d99e9be298fedf82fa0c0 WatchSource:0}: Error finding container da32df1e98c6cd9d5612044f986edd7e92436a0c5f5d99e9be298fedf82fa0c0: Status 404 returned error can't find the container with id da32df1e98c6cd9d5612044f986edd7e92436a0c5f5d99e9be298fedf82fa0c0
Mar 18 16:47:17.422384 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:17.422353 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-l45t6" event={"ID":"c40b2ed4-792d-4afc-bc1a-3aad44ac26e0","Type":"ContainerStarted","Data":"7cc7e83911238891a01bebf0d0079c781d1bfa95b4f651ff594eeec4f6edb909"}
Mar 18 16:47:17.423281 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:17.423247 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-x862v" event={"ID":"0f569b4c-304c-4347-827a-116204073ddf","Type":"ContainerStarted","Data":"da32df1e98c6cd9d5612044f986edd7e92436a0c5f5d99e9be298fedf82fa0c0"}
Mar 18 16:47:19.224914 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.224882 2536 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-55b77584bb-j76sf"]
Mar 18 16:47:19.228000 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.227971 2536 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-s7d2l"]
Mar 18 16:47:19.228124 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.228104 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-55b77584bb-j76sf"
Mar 18 16:47:19.230746 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.230718 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Mar 18 16:47:19.230854 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.230832 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-79xdw\""
Mar 18 16:47:19.231306 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.231255 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-s7d2l"
Mar 18 16:47:19.233578 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.232994 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Mar 18 16:47:19.234723 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.234643 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Mar 18 16:47:19.234822 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.234774 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-fhwjh\""
Mar 18 16:47:19.241874 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.241852 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Mar 18 16:47:19.242353 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.242156 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Mar 18 16:47:19.262532 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.262503 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-55b77584bb-j76sf"]
Mar 18 16:47:19.263157 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.263133 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Mar 18 16:47:19.312700 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.312670 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-s7d2l"]
Mar 18 16:47:19.369207 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.369186 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c059e498-a993-46a1-8de3-8ae39045a1e7-networking-console-plugin-cert\") pod \"networking-console-plugin-55b77584bb-j76sf\" (UID: \"c059e498-a993-46a1-8de3-8ae39045a1e7\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-j76sf"
Mar 18 16:47:19.369346 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.369225 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/9da589e5-8a15-4bfd-8947-4f0291b208b2-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-s7d2l\" (UID: \"9da589e5-8a15-4bfd-8947-4f0291b208b2\") " pod="openshift-insights/insights-runtime-extractor-s7d2l"
Mar 18 16:47:19.369346 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.369251 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c059e498-a993-46a1-8de3-8ae39045a1e7-nginx-conf\") pod \"networking-console-plugin-55b77584bb-j76sf\" (UID: \"c059e498-a993-46a1-8de3-8ae39045a1e7\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-j76sf"
Mar 18 16:47:19.369346 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.369309 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9da589e5-8a15-4bfd-8947-4f0291b208b2-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-s7d2l\" (UID: \"9da589e5-8a15-4bfd-8947-4f0291b208b2\") " pod="openshift-insights/insights-runtime-extractor-s7d2l"
Mar 18 16:47:19.369549 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.369386 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zllmh\" (UniqueName: \"kubernetes.io/projected/9da589e5-8a15-4bfd-8947-4f0291b208b2-kube-api-access-zllmh\") pod \"insights-runtime-extractor-s7d2l\" (UID: \"9da589e5-8a15-4bfd-8947-4f0291b208b2\") " pod="openshift-insights/insights-runtime-extractor-s7d2l"
Mar 18 16:47:19.369549 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.369436 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/9da589e5-8a15-4bfd-8947-4f0291b208b2-data-volume\") pod \"insights-runtime-extractor-s7d2l\" (UID: \"9da589e5-8a15-4bfd-8947-4f0291b208b2\") " pod="openshift-insights/insights-runtime-extractor-s7d2l"
Mar 18 16:47:19.369549 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.369490 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/9da589e5-8a15-4bfd-8947-4f0291b208b2-crio-socket\") pod \"insights-runtime-extractor-s7d2l\" (UID: \"9da589e5-8a15-4bfd-8947-4f0291b208b2\") " pod="openshift-insights/insights-runtime-extractor-s7d2l"
Mar 18 16:47:19.388419 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.388395 2536 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-8598bb85b4-b5rkb"]
Mar 18 16:47:19.391461 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.391442 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-8598bb85b4-b5rkb"
Mar 18 16:47:19.406562 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.406534 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Mar 18 16:47:19.406909 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.406893 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Mar 18 16:47:19.407191 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.407178 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Mar 18 16:47:19.408397 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.408374 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-8k48n\""
Mar 18 16:47:19.408978 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.408963 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Mar 18 16:47:19.420236 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.420209 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-8598bb85b4-b5rkb"]
Mar 18 16:47:19.430246 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.430225 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-l45t6" event={"ID":"c40b2ed4-792d-4afc-bc1a-3aad44ac26e0","Type":"ContainerStarted","Data":"01ea431963b62de5e2d210d347452b5e4f986af1d53d9214a7be4389a24bf748"}
Mar 18 16:47:19.430344 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.430254 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-l45t6" event={"ID":"c40b2ed4-792d-4afc-bc1a-3aad44ac26e0","Type":"ContainerStarted","Data":"f168f01d5ba6515602314223b02d78adbe916ad61997851e6177e752105a60fc"}
Mar 18 16:47:19.431370 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.431340 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-x862v" event={"ID":"0f569b4c-304c-4347-827a-116204073ddf","Type":"ContainerStarted","Data":"14ffd6c42fb22ad2114913fb273924316d12c90fd10c40e832fc0b6f3ae41d7f"}
Mar 18 16:47:19.470683 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.470644 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e5f11bd2-354b-4f12-b187-4eef4f830794-registry-certificates\") pod \"image-registry-8598bb85b4-b5rkb\" (UID: \"e5f11bd2-354b-4f12-b187-4eef4f830794\") " pod="openshift-image-registry/image-registry-8598bb85b4-b5rkb"
Mar 18 16:47:19.470791 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.470698 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/9da589e5-8a15-4bfd-8947-4f0291b208b2-crio-socket\") pod \"insights-runtime-extractor-s7d2l\" (UID: \"9da589e5-8a15-4bfd-8947-4f0291b208b2\") " pod="openshift-insights/insights-runtime-extractor-s7d2l"
Mar 18 16:47:19.470791 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.470728 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e5f11bd2-354b-4f12-b187-4eef4f830794-installation-pull-secrets\") pod \"image-registry-8598bb85b4-b5rkb\" (UID: \"e5f11bd2-354b-4f12-b187-4eef4f830794\") " pod="openshift-image-registry/image-registry-8598bb85b4-b5rkb"
Mar 18 16:47:19.470791 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.470757 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c059e498-a993-46a1-8de3-8ae39045a1e7-networking-console-plugin-cert\") pod \"networking-console-plugin-55b77584bb-j76sf\" (UID: \"c059e498-a993-46a1-8de3-8ae39045a1e7\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-j76sf"
Mar 18 16:47:19.470791 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.470768 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/9da589e5-8a15-4bfd-8947-4f0291b208b2-crio-socket\") pod \"insights-runtime-extractor-s7d2l\" (UID: \"9da589e5-8a15-4bfd-8947-4f0291b208b2\") " pod="openshift-insights/insights-runtime-extractor-s7d2l"
Mar 18 16:47:19.470979 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.470796 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l98zw\" (UniqueName: \"kubernetes.io/projected/e5f11bd2-354b-4f12-b187-4eef4f830794-kube-api-access-l98zw\") pod \"image-registry-8598bb85b4-b5rkb\" (UID: \"e5f11bd2-354b-4f12-b187-4eef4f830794\") " pod="openshift-image-registry/image-registry-8598bb85b4-b5rkb"
Mar 18 16:47:19.470979 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.470832 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zllmh\" (UniqueName: \"kubernetes.io/projected/9da589e5-8a15-4bfd-8947-4f0291b208b2-kube-api-access-zllmh\") pod \"insights-runtime-extractor-s7d2l\" (UID: \"9da589e5-8a15-4bfd-8947-4f0291b208b2\") " pod="openshift-insights/insights-runtime-extractor-s7d2l"
Mar 18 16:47:19.470979 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.470861 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e5f11bd2-354b-4f12-b187-4eef4f830794-trusted-ca\") pod \"image-registry-8598bb85b4-b5rkb\" (UID: \"e5f11bd2-354b-4f12-b187-4eef4f830794\") " pod="openshift-image-registry/image-registry-8598bb85b4-b5rkb"
Mar 18 16:47:19.470979 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.470891 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/9da589e5-8a15-4bfd-8947-4f0291b208b2-data-volume\") pod \"insights-runtime-extractor-s7d2l\" (UID: \"9da589e5-8a15-4bfd-8947-4f0291b208b2\") " pod="openshift-insights/insights-runtime-extractor-s7d2l"
Mar 18 16:47:19.470979 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.470920 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e5f11bd2-354b-4f12-b187-4eef4f830794-image-registry-private-configuration\") pod \"image-registry-8598bb85b4-b5rkb\" (UID: \"e5f11bd2-354b-4f12-b187-4eef4f830794\") " pod="openshift-image-registry/image-registry-8598bb85b4-b5rkb"
Mar 18 16:47:19.471152 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.471069 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e5f11bd2-354b-4f12-b187-4eef4f830794-ca-trust-extracted\") pod \"image-registry-8598bb85b4-b5rkb\" (UID: \"e5f11bd2-354b-4f12-b187-4eef4f830794\") " pod="openshift-image-registry/image-registry-8598bb85b4-b5rkb"
Mar 18 16:47:19.471152 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.471103 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/9da589e5-8a15-4bfd-8947-4f0291b208b2-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-s7d2l\" (UID: \"9da589e5-8a15-4bfd-8947-4f0291b208b2\") " pod="openshift-insights/insights-runtime-extractor-s7d2l"
Mar 18 16:47:19.471152 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.471135 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c059e498-a993-46a1-8de3-8ae39045a1e7-nginx-conf\") pod \"networking-console-plugin-55b77584bb-j76sf\" (UID: \"c059e498-a993-46a1-8de3-8ae39045a1e7\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-j76sf"
Mar 18 16:47:19.471298 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.471169 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e5f11bd2-354b-4f12-b187-4eef4f830794-bound-sa-token\") pod \"image-registry-8598bb85b4-b5rkb\" (UID: \"e5f11bd2-354b-4f12-b187-4eef4f830794\") " pod="openshift-image-registry/image-registry-8598bb85b4-b5rkb"
Mar 18 16:47:19.471298 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.471177 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/9da589e5-8a15-4bfd-8947-4f0291b208b2-data-volume\") pod \"insights-runtime-extractor-s7d2l\" (UID: \"9da589e5-8a15-4bfd-8947-4f0291b208b2\") " pod="openshift-insights/insights-runtime-extractor-s7d2l"
Mar 18 16:47:19.471298 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.471201 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9da589e5-8a15-4bfd-8947-4f0291b208b2-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-s7d2l\" (UID: \"9da589e5-8a15-4bfd-8947-4f0291b208b2\") " pod="openshift-insights/insights-runtime-extractor-s7d2l"
Mar 18 16:47:19.471298 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.471251 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e5f11bd2-354b-4f12-b187-4eef4f830794-registry-tls\") pod \"image-registry-8598bb85b4-b5rkb\" (UID: \"e5f11bd2-354b-4f12-b187-4eef4f830794\") " pod="openshift-image-registry/image-registry-8598bb85b4-b5rkb"
Mar 18 16:47:19.471560 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.471542 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/9da589e5-8a15-4bfd-8947-4f0291b208b2-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-s7d2l\" (UID: \"9da589e5-8a15-4bfd-8947-4f0291b208b2\") " pod="openshift-insights/insights-runtime-extractor-s7d2l"
Mar 18 16:47:19.471984 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.471964 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c059e498-a993-46a1-8de3-8ae39045a1e7-nginx-conf\") pod \"networking-console-plugin-55b77584bb-j76sf\" (UID: \"c059e498-a993-46a1-8de3-8ae39045a1e7\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-j76sf"
Mar 18 16:47:19.473382 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.473361 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9da589e5-8a15-4bfd-8947-4f0291b208b2-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-s7d2l\" (UID: \"9da589e5-8a15-4bfd-8947-4f0291b208b2\") " pod="openshift-insights/insights-runtime-extractor-s7d2l"
Mar 18 16:47:19.473450 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.473415 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c059e498-a993-46a1-8de3-8ae39045a1e7-networking-console-plugin-cert\") pod \"networking-console-plugin-55b77584bb-j76sf\" (UID: \"c059e498-a993-46a1-8de3-8ae39045a1e7\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-j76sf"
Mar 18 16:47:19.496155 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.496130 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zllmh\" (UniqueName: \"kubernetes.io/projected/9da589e5-8a15-4bfd-8947-4f0291b208b2-kube-api-access-zllmh\") pod \"insights-runtime-extractor-s7d2l\" (UID: \"9da589e5-8a15-4bfd-8947-4f0291b208b2\") " pod="openshift-insights/insights-runtime-extractor-s7d2l"
Mar 18 16:47:19.506523 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.506487 2536 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-x862v" podStartSLOduration=128.713637342 podStartE2EDuration="2m10.506475204s" podCreationTimestamp="2026-03-18 16:45:09 +0000 UTC" firstStartedPulling="2026-03-18 16:47:17.362152198 +0000 UTC m=+160.135726767" lastFinishedPulling="2026-03-18 16:47:19.154990065 +0000 UTC m=+161.928564629" observedRunningTime="2026-03-18 16:47:19.506025278 +0000 UTC m=+162.279599866" watchObservedRunningTime="2026-03-18 16:47:19.506475204 +0000 UTC m=+162.280049790"
Mar 18 16:47:19.542871 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.542850 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-55b77584bb-j76sf"
Mar 18 16:47:19.561894 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.561866 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-s7d2l"
Mar 18 16:47:19.571773 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.571746 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e5f11bd2-354b-4f12-b187-4eef4f830794-installation-pull-secrets\") pod \"image-registry-8598bb85b4-b5rkb\" (UID: \"e5f11bd2-354b-4f12-b187-4eef4f830794\") " pod="openshift-image-registry/image-registry-8598bb85b4-b5rkb"
Mar 18 16:47:19.571938 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.571865 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l98zw\" (UniqueName: \"kubernetes.io/projected/e5f11bd2-354b-4f12-b187-4eef4f830794-kube-api-access-l98zw\") pod \"image-registry-8598bb85b4-b5rkb\" (UID: \"e5f11bd2-354b-4f12-b187-4eef4f830794\") " pod="openshift-image-registry/image-registry-8598bb85b4-b5rkb"
Mar 18 16:47:19.571938 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.571915 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e5f11bd2-354b-4f12-b187-4eef4f830794-trusted-ca\") pod \"image-registry-8598bb85b4-b5rkb\" (UID: \"e5f11bd2-354b-4f12-b187-4eef4f830794\") " pod="openshift-image-registry/image-registry-8598bb85b4-b5rkb"
Mar 18 16:47:19.572076 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.571949 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e5f11bd2-354b-4f12-b187-4eef4f830794-image-registry-private-configuration\") pod \"image-registry-8598bb85b4-b5rkb\" (UID: \"e5f11bd2-354b-4f12-b187-4eef4f830794\") " pod="openshift-image-registry/image-registry-8598bb85b4-b5rkb"
Mar 18 16:47:19.572076 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.571994 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e5f11bd2-354b-4f12-b187-4eef4f830794-ca-trust-extracted\") pod \"image-registry-8598bb85b4-b5rkb\" (UID: \"e5f11bd2-354b-4f12-b187-4eef4f830794\") " pod="openshift-image-registry/image-registry-8598bb85b4-b5rkb"
Mar 18 16:47:19.572521 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.572421 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e5f11bd2-354b-4f12-b187-4eef4f830794-ca-trust-extracted\") pod \"image-registry-8598bb85b4-b5rkb\" (UID: \"e5f11bd2-354b-4f12-b187-4eef4f830794\") " pod="openshift-image-registry/image-registry-8598bb85b4-b5rkb"
Mar 18 16:47:19.572627 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.572520 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e5f11bd2-354b-4f12-b187-4eef4f830794-bound-sa-token\") pod \"image-registry-8598bb85b4-b5rkb\" (UID: \"e5f11bd2-354b-4f12-b187-4eef4f830794\") " pod="openshift-image-registry/image-registry-8598bb85b4-b5rkb"
Mar 18 16:47:19.572627 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.572569 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e5f11bd2-354b-4f12-b187-4eef4f830794-registry-tls\") pod \"image-registry-8598bb85b4-b5rkb\" (UID: \"e5f11bd2-354b-4f12-b187-4eef4f830794\") " pod="openshift-image-registry/image-registry-8598bb85b4-b5rkb"
Mar 18 16:47:19.572627 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.572608 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e5f11bd2-354b-4f12-b187-4eef4f830794-registry-certificates\") pod \"image-registry-8598bb85b4-b5rkb\" (UID: \"e5f11bd2-354b-4f12-b187-4eef4f830794\") " pod="openshift-image-registry/image-registry-8598bb85b4-b5rkb"
Mar 18 16:47:19.573781 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.573740 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e5f11bd2-354b-4f12-b187-4eef4f830794-trusted-ca\") pod \"image-registry-8598bb85b4-b5rkb\" (UID: \"e5f11bd2-354b-4f12-b187-4eef4f830794\") " pod="openshift-image-registry/image-registry-8598bb85b4-b5rkb"
Mar 18 16:47:19.573883 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.573742 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e5f11bd2-354b-4f12-b187-4eef4f830794-registry-certificates\") pod \"image-registry-8598bb85b4-b5rkb\" (UID: \"e5f11bd2-354b-4f12-b187-4eef4f830794\") " pod="openshift-image-registry/image-registry-8598bb85b4-b5rkb"
Mar 18 16:47:19.574689 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.574658 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e5f11bd2-354b-4f12-b187-4eef4f830794-installation-pull-secrets\") pod \"image-registry-8598bb85b4-b5rkb\" (UID: \"e5f11bd2-354b-4f12-b187-4eef4f830794\") " pod="openshift-image-registry/image-registry-8598bb85b4-b5rkb"
Mar 18 16:47:19.575135 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.575113 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e5f11bd2-354b-4f12-b187-4eef4f830794-image-registry-private-configuration\") pod \"image-registry-8598bb85b4-b5rkb\" (UID: \"e5f11bd2-354b-4f12-b187-4eef4f830794\") " pod="openshift-image-registry/image-registry-8598bb85b4-b5rkb"
Mar 18 16:47:19.575653 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.575635 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e5f11bd2-354b-4f12-b187-4eef4f830794-registry-tls\") pod \"image-registry-8598bb85b4-b5rkb\" (UID: \"e5f11bd2-354b-4f12-b187-4eef4f830794\") " pod="openshift-image-registry/image-registry-8598bb85b4-b5rkb"
Mar 18 16:47:19.587055 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.587036 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e5f11bd2-354b-4f12-b187-4eef4f830794-bound-sa-token\") pod \"image-registry-8598bb85b4-b5rkb\" (UID: \"e5f11bd2-354b-4f12-b187-4eef4f830794\") " pod="openshift-image-registry/image-registry-8598bb85b4-b5rkb"
Mar 18 16:47:19.597047 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.596991 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l98zw\" (UniqueName: \"kubernetes.io/projected/e5f11bd2-354b-4f12-b187-4eef4f830794-kube-api-access-l98zw\") pod \"image-registry-8598bb85b4-b5rkb\" (UID: \"e5f11bd2-354b-4f12-b187-4eef4f830794\") " pod="openshift-image-registry/image-registry-8598bb85b4-b5rkb"
Mar 18 16:47:19.665907 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.665881 2536 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-76b8565867-mx9sz"
Mar 18 16:47:19.666067 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.665920 2536 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-76b8565867-mx9sz"
Mar 18 16:47:19.666254 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.666242 2536 scope.go:117] "RemoveContainer" containerID="2eecbd7aad2517e638c2f2ffa41628d27302acd99633acfdb5892c6e1ffa75fd"
Mar 18 16:47:19.666444 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:47:19.666427 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-76b8565867-mx9sz_openshift-console-operator(95e73a3f-8a85-403f-b00b-17524a80b500)\"" pod="openshift-console-operator/console-operator-76b8565867-mx9sz" podUID="95e73a3f-8a85-403f-b00b-17524a80b500"
Mar 18 16:47:19.683466 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.683427 2536 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-l45t6" podStartSLOduration=128.919452788 podStartE2EDuration="2m10.683414533s" podCreationTimestamp="2026-03-18 16:45:09 +0000 UTC" firstStartedPulling="2026-03-18 16:47:17.348319454 +0000 UTC m=+160.121894018" lastFinishedPulling="2026-03-18 16:47:19.112281186 +0000 UTC m=+161.885855763" observedRunningTime="2026-03-18 16:47:19.54381088 +0000 UTC m=+162.317385467" watchObservedRunningTime="2026-03-18 16:47:19.683414533 +0000 UTC m=+162.456989098"
Mar 18 16:47:19.684294 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.684278 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-55b77584bb-j76sf"]
Mar 18 16:47:19.686169 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:47:19.686144 2536 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc059e498_a993_46a1_8de3_8ae39045a1e7.slice/crio-e3b5a0ad180a327b3bd8c0115f51684518c166d0205d03e218253c232c42e60f WatchSource:0}: Error finding container e3b5a0ad180a327b3bd8c0115f51684518c166d0205d03e218253c232c42e60f: Status 404 returned error can't find the container with id e3b5a0ad180a327b3bd8c0115f51684518c166d0205d03e218253c232c42e60f
Mar 18 16:47:19.702932 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.702909 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-8598bb85b4-b5rkb"
Mar 18 16:47:19.714904 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:47:19.714882 2536 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9da589e5_8a15_4bfd_8947_4f0291b208b2.slice/crio-2dfc098dbca1fa8e4c3922ad84a4452a611b58555f6a5f388dd3832e029c7603 WatchSource:0}: Error finding container 2dfc098dbca1fa8e4c3922ad84a4452a611b58555f6a5f388dd3832e029c7603: Status 404 returned error can't find the container with id 2dfc098dbca1fa8e4c3922ad84a4452a611b58555f6a5f388dd3832e029c7603
Mar 18 16:47:19.718294 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.718255 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-s7d2l"]
Mar 18 16:47:19.873475 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:19.873378 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-8598bb85b4-b5rkb"]
Mar 18 16:47:19.875714 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:47:19.875688 2536 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5f11bd2_354b_4f12_b187_4eef4f830794.slice/crio-8d1a636adebfce71b95ff28c1ce22eb298a163e8dcec5eee76b6ad6e9fa0e9d3 WatchSource:0}: Error finding container 8d1a636adebfce71b95ff28c1ce22eb298a163e8dcec5eee76b6ad6e9fa0e9d3: Status 404 returned error can't find the container with id 8d1a636adebfce71b95ff28c1ce22eb298a163e8dcec5eee76b6ad6e9fa0e9d3
Mar 18 16:47:20.436041 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:20.436005 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8598bb85b4-b5rkb" event={"ID":"e5f11bd2-354b-4f12-b187-4eef4f830794","Type":"ContainerStarted","Data":"3192380907b4b5626f68044e1c13127c2d02e12aaa54921d2b74129010c3f057"}
Mar 18 16:47:20.436471 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:20.436050 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8598bb85b4-b5rkb" event={"ID":"e5f11bd2-354b-4f12-b187-4eef4f830794","Type":"ContainerStarted","Data":"8d1a636adebfce71b95ff28c1ce22eb298a163e8dcec5eee76b6ad6e9fa0e9d3"}
Mar 18 16:47:20.436471 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:20.436096 2536 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-8598bb85b4-b5rkb"
Mar 18 16:47:20.437211 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:20.437187 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-55b77584bb-j76sf" event={"ID":"c059e498-a993-46a1-8de3-8ae39045a1e7","Type":"ContainerStarted","Data":"e3b5a0ad180a327b3bd8c0115f51684518c166d0205d03e218253c232c42e60f"}
Mar 18 16:47:20.438980 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:20.438953 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-s7d2l" event={"ID":"9da589e5-8a15-4bfd-8947-4f0291b208b2","Type":"ContainerStarted","Data":"8884367b9f5d493bc29cca3c6e9bb4c5ee17ce808a6ed3b0724a274643f927c4"}
Mar 18 16:47:20.439094 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:20.438987 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-s7d2l" event={"ID":"9da589e5-8a15-4bfd-8947-4f0291b208b2","Type":"ContainerStarted","Data":"62fac8fc30faaba81815050907bcd3b68c13bdb83f94e897ae7e313b6e54cb41"}
Mar 18 16:47:20.439094 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:20.439002 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-s7d2l" event={"ID":"9da589e5-8a15-4bfd-8947-4f0291b208b2","Type":"ContainerStarted","Data":"2dfc098dbca1fa8e4c3922ad84a4452a611b58555f6a5f388dd3832e029c7603"}
Mar 18 16:47:20.439221 ip-10-0-132-224
kubenswrapper[2536]: I0318 16:47:20.439204 2536 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-l45t6" Mar 18 16:47:21.443931 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:21.443890 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-55b77584bb-j76sf" event={"ID":"c059e498-a993-46a1-8de3-8ae39045a1e7","Type":"ContainerStarted","Data":"59fcafc3628a11fefd0977e19e5ebdc4092fdc8866761b22faf3288d76ecbe5e"} Mar 18 16:47:21.465664 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:21.465618 2536 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-8598bb85b4-b5rkb" podStartSLOduration=2.465603105 podStartE2EDuration="2.465603105s" podCreationTimestamp="2026-03-18 16:47:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:47:20.492937632 +0000 UTC m=+163.266512228" watchObservedRunningTime="2026-03-18 16:47:21.465603105 +0000 UTC m=+164.239177685" Mar 18 16:47:22.448496 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:22.448415 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-s7d2l" event={"ID":"9da589e5-8a15-4bfd-8947-4f0291b208b2","Type":"ContainerStarted","Data":"4cae71bf42ce7e22cc8fcc1797447184e98c548d1f3bd6635921c9ca615e483a"} Mar 18 16:47:22.480936 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:22.480890 2536 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-s7d2l" podStartSLOduration=1.4001634059999999 podStartE2EDuration="3.480877223s" podCreationTimestamp="2026-03-18 16:47:19 +0000 UTC" firstStartedPulling="2026-03-18 16:47:19.803468021 +0000 UTC m=+162.577042591" lastFinishedPulling="2026-03-18 16:47:21.884181845 +0000 UTC m=+164.657756408" 
observedRunningTime="2026-03-18 16:47:22.479985246 +0000 UTC m=+165.253559832" watchObservedRunningTime="2026-03-18 16:47:22.480877223 +0000 UTC m=+165.254451791" Mar 18 16:47:22.481391 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:22.481361 2536 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-55b77584bb-j76sf" podStartSLOduration=2.223974255 podStartE2EDuration="3.481353503s" podCreationTimestamp="2026-03-18 16:47:19 +0000 UTC" firstStartedPulling="2026-03-18 16:47:19.689654105 +0000 UTC m=+162.463228672" lastFinishedPulling="2026-03-18 16:47:20.947033356 +0000 UTC m=+163.720607920" observedRunningTime="2026-03-18 16:47:21.467053764 +0000 UTC m=+164.240628348" watchObservedRunningTime="2026-03-18 16:47:22.481353503 +0000 UTC m=+165.254928111" Mar 18 16:47:24.906282 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:24.906230 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rjx6m" Mar 18 16:47:30.446204 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:30.446172 2536 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-l45t6" Mar 18 16:47:31.906605 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:31.906576 2536 scope.go:117] "RemoveContainer" containerID="2eecbd7aad2517e638c2f2ffa41628d27302acd99633acfdb5892c6e1ffa75fd" Mar 18 16:47:31.906984 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:47:31.906761 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-76b8565867-mx9sz_openshift-console-operator(95e73a3f-8a85-403f-b00b-17524a80b500)\"" pod="openshift-console-operator/console-operator-76b8565867-mx9sz" podUID="95e73a3f-8a85-403f-b00b-17524a80b500" Mar 18 16:47:36.101957 ip-10-0-132-224 kubenswrapper[2536]: 
I0318 16:47:36.101921 2536 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-6df7999c47-v5fw4"] Mar 18 16:47:36.106389 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.106369 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-6df7999c47-v5fw4" Mar 18 16:47:36.108771 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.108750 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Mar 18 16:47:36.108882 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.108773 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Mar 18 16:47:36.109375 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.109358 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Mar 18 16:47:36.109456 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.109373 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Mar 18 16:47:36.109512 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.109450 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Mar 18 16:47:36.109574 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.109533 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Mar 18 16:47:36.109960 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.109943 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-kkktp\"" Mar 18 16:47:36.116671 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.116649 2536 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-6df7999c47-v5fw4"] Mar 18 16:47:36.228360 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.228331 2536 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-68b5d5d464-qmgzp"] Mar 18 16:47:36.231497 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.231481 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-qmgzp" Mar 18 16:47:36.233398 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.233371 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Mar 18 16:47:36.233514 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.233438 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Mar 18 16:47:36.233514 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.233494 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-67w6q\"" Mar 18 16:47:36.241045 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.241021 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-68b5d5d464-qmgzp"] Mar 18 16:47:36.249130 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.249112 2536 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-dqszd"] Mar 18 16:47:36.252051 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.252035 2536 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-dqszd" Mar 18 16:47:36.254022 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.253998 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Mar 18 16:47:36.254305 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.254247 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Mar 18 16:47:36.254428 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.254406 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Mar 18 16:47:36.254581 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.254565 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-g2m56\"" Mar 18 16:47:36.289888 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.289867 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/25bd2c32-7449-4f75-ab0e-7b815e14c3ca-volume-directive-shadow\") pod \"kube-state-metrics-6df7999c47-v5fw4\" (UID: \"25bd2c32-7449-4f75-ab0e-7b815e14c3ca\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-v5fw4" Mar 18 16:47:36.289990 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.289896 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/25bd2c32-7449-4f75-ab0e-7b815e14c3ca-metrics-client-ca\") pod \"kube-state-metrics-6df7999c47-v5fw4\" (UID: \"25bd2c32-7449-4f75-ab0e-7b815e14c3ca\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-v5fw4" Mar 18 16:47:36.289990 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.289916 2536 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/25bd2c32-7449-4f75-ab0e-7b815e14c3ca-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-6df7999c47-v5fw4\" (UID: \"25bd2c32-7449-4f75-ab0e-7b815e14c3ca\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-v5fw4" Mar 18 16:47:36.289990 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.289934 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/25bd2c32-7449-4f75-ab0e-7b815e14c3ca-kube-state-metrics-tls\") pod \"kube-state-metrics-6df7999c47-v5fw4\" (UID: \"25bd2c32-7449-4f75-ab0e-7b815e14c3ca\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-v5fw4" Mar 18 16:47:36.290099 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.290015 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6fjb\" (UniqueName: \"kubernetes.io/projected/25bd2c32-7449-4f75-ab0e-7b815e14c3ca-kube-api-access-f6fjb\") pod \"kube-state-metrics-6df7999c47-v5fw4\" (UID: \"25bd2c32-7449-4f75-ab0e-7b815e14c3ca\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-v5fw4" Mar 18 16:47:36.290099 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.290076 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/25bd2c32-7449-4f75-ab0e-7b815e14c3ca-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-6df7999c47-v5fw4\" (UID: \"25bd2c32-7449-4f75-ab0e-7b815e14c3ca\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-v5fw4" Mar 18 16:47:36.391149 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.391086 2536 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4d7aaf65-f284-4670-8d82-f69adb1a0774-sys\") pod \"node-exporter-dqszd\" (UID: \"4d7aaf65-f284-4670-8d82-f69adb1a0774\") " pod="openshift-monitoring/node-exporter-dqszd" Mar 18 16:47:36.391149 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.391128 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/25bd2c32-7449-4f75-ab0e-7b815e14c3ca-volume-directive-shadow\") pod \"kube-state-metrics-6df7999c47-v5fw4\" (UID: \"25bd2c32-7449-4f75-ab0e-7b815e14c3ca\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-v5fw4" Mar 18 16:47:36.391325 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.391159 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/25bd2c32-7449-4f75-ab0e-7b815e14c3ca-metrics-client-ca\") pod \"kube-state-metrics-6df7999c47-v5fw4\" (UID: \"25bd2c32-7449-4f75-ab0e-7b815e14c3ca\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-v5fw4" Mar 18 16:47:36.391325 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.391184 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/25bd2c32-7449-4f75-ab0e-7b815e14c3ca-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-6df7999c47-v5fw4\" (UID: \"25bd2c32-7449-4f75-ab0e-7b815e14c3ca\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-v5fw4" Mar 18 16:47:36.391325 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.391213 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/556af7ea-9e15-4fb2-b326-af5947c1713a-metrics-client-ca\") pod 
\"openshift-state-metrics-68b5d5d464-qmgzp\" (UID: \"556af7ea-9e15-4fb2-b326-af5947c1713a\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-qmgzp" Mar 18 16:47:36.391325 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.391239 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grr29\" (UniqueName: \"kubernetes.io/projected/556af7ea-9e15-4fb2-b326-af5947c1713a-kube-api-access-grr29\") pod \"openshift-state-metrics-68b5d5d464-qmgzp\" (UID: \"556af7ea-9e15-4fb2-b326-af5947c1713a\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-qmgzp" Mar 18 16:47:36.391472 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.391378 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/25bd2c32-7449-4f75-ab0e-7b815e14c3ca-kube-state-metrics-tls\") pod \"kube-state-metrics-6df7999c47-v5fw4\" (UID: \"25bd2c32-7449-4f75-ab0e-7b815e14c3ca\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-v5fw4" Mar 18 16:47:36.391472 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.391424 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f6fjb\" (UniqueName: \"kubernetes.io/projected/25bd2c32-7449-4f75-ab0e-7b815e14c3ca-kube-api-access-f6fjb\") pod \"kube-state-metrics-6df7999c47-v5fw4\" (UID: \"25bd2c32-7449-4f75-ab0e-7b815e14c3ca\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-v5fw4" Mar 18 16:47:36.391472 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.391464 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4d7aaf65-f284-4670-8d82-f69adb1a0774-node-exporter-textfile\") pod \"node-exporter-dqszd\" (UID: \"4d7aaf65-f284-4670-8d82-f69adb1a0774\") " pod="openshift-monitoring/node-exporter-dqszd" Mar 18 16:47:36.391602 
ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.391501 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4d7aaf65-f284-4670-8d82-f69adb1a0774-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dqszd\" (UID: \"4d7aaf65-f284-4670-8d82-f69adb1a0774\") " pod="openshift-monitoring/node-exporter-dqszd" Mar 18 16:47:36.391602 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:47:36.391506 2536 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Mar 18 16:47:36.391602 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.391524 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/25bd2c32-7449-4f75-ab0e-7b815e14c3ca-volume-directive-shadow\") pod \"kube-state-metrics-6df7999c47-v5fw4\" (UID: \"25bd2c32-7449-4f75-ab0e-7b815e14c3ca\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-v5fw4" Mar 18 16:47:36.391602 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.391532 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th6cl\" (UniqueName: \"kubernetes.io/projected/4d7aaf65-f284-4670-8d82-f69adb1a0774-kube-api-access-th6cl\") pod \"node-exporter-dqszd\" (UID: \"4d7aaf65-f284-4670-8d82-f69adb1a0774\") " pod="openshift-monitoring/node-exporter-dqszd" Mar 18 16:47:36.391602 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:47:36.391584 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25bd2c32-7449-4f75-ab0e-7b815e14c3ca-kube-state-metrics-tls podName:25bd2c32-7449-4f75-ab0e-7b815e14c3ca nodeName:}" failed. No retries permitted until 2026-03-18 16:47:36.891563822 +0000 UTC m=+179.665138408 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/25bd2c32-7449-4f75-ab0e-7b815e14c3ca-kube-state-metrics-tls") pod "kube-state-metrics-6df7999c47-v5fw4" (UID: "25bd2c32-7449-4f75-ab0e-7b815e14c3ca") : secret "kube-state-metrics-tls" not found Mar 18 16:47:36.391811 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.391648 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4d7aaf65-f284-4670-8d82-f69adb1a0774-root\") pod \"node-exporter-dqszd\" (UID: \"4d7aaf65-f284-4670-8d82-f69adb1a0774\") " pod="openshift-monitoring/node-exporter-dqszd" Mar 18 16:47:36.391811 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.391680 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4d7aaf65-f284-4670-8d82-f69adb1a0774-metrics-client-ca\") pod \"node-exporter-dqszd\" (UID: \"4d7aaf65-f284-4670-8d82-f69adb1a0774\") " pod="openshift-monitoring/node-exporter-dqszd" Mar 18 16:47:36.391811 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.391729 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4d7aaf65-f284-4670-8d82-f69adb1a0774-node-exporter-accelerators-collector-config\") pod \"node-exporter-dqszd\" (UID: \"4d7aaf65-f284-4670-8d82-f69adb1a0774\") " pod="openshift-monitoring/node-exporter-dqszd" Mar 18 16:47:36.391811 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.391761 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4d7aaf65-f284-4670-8d82-f69adb1a0774-node-exporter-wtmp\") pod \"node-exporter-dqszd\" (UID: \"4d7aaf65-f284-4670-8d82-f69adb1a0774\") " 
pod="openshift-monitoring/node-exporter-dqszd" Mar 18 16:47:36.391811 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.391796 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/25bd2c32-7449-4f75-ab0e-7b815e14c3ca-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-6df7999c47-v5fw4\" (UID: \"25bd2c32-7449-4f75-ab0e-7b815e14c3ca\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-v5fw4" Mar 18 16:47:36.391999 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.391830 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/556af7ea-9e15-4fb2-b326-af5947c1713a-openshift-state-metrics-tls\") pod \"openshift-state-metrics-68b5d5d464-qmgzp\" (UID: \"556af7ea-9e15-4fb2-b326-af5947c1713a\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-qmgzp" Mar 18 16:47:36.391999 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.391869 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4d7aaf65-f284-4670-8d82-f69adb1a0774-node-exporter-tls\") pod \"node-exporter-dqszd\" (UID: \"4d7aaf65-f284-4670-8d82-f69adb1a0774\") " pod="openshift-monitoring/node-exporter-dqszd" Mar 18 16:47:36.391999 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.391894 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/556af7ea-9e15-4fb2-b326-af5947c1713a-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-68b5d5d464-qmgzp\" (UID: \"556af7ea-9e15-4fb2-b326-af5947c1713a\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-qmgzp" Mar 18 
16:47:36.391999 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.391976 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/25bd2c32-7449-4f75-ab0e-7b815e14c3ca-metrics-client-ca\") pod \"kube-state-metrics-6df7999c47-v5fw4\" (UID: \"25bd2c32-7449-4f75-ab0e-7b815e14c3ca\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-v5fw4" Mar 18 16:47:36.392348 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.392329 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/25bd2c32-7449-4f75-ab0e-7b815e14c3ca-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-6df7999c47-v5fw4\" (UID: \"25bd2c32-7449-4f75-ab0e-7b815e14c3ca\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-v5fw4" Mar 18 16:47:36.393843 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.393820 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/25bd2c32-7449-4f75-ab0e-7b815e14c3ca-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-6df7999c47-v5fw4\" (UID: \"25bd2c32-7449-4f75-ab0e-7b815e14c3ca\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-v5fw4" Mar 18 16:47:36.399227 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.399207 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6fjb\" (UniqueName: \"kubernetes.io/projected/25bd2c32-7449-4f75-ab0e-7b815e14c3ca-kube-api-access-f6fjb\") pod \"kube-state-metrics-6df7999c47-v5fw4\" (UID: \"25bd2c32-7449-4f75-ab0e-7b815e14c3ca\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-v5fw4" Mar 18 16:47:36.492554 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.492520 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4d7aaf65-f284-4670-8d82-f69adb1a0774-node-exporter-accelerators-collector-config\") pod \"node-exporter-dqszd\" (UID: \"4d7aaf65-f284-4670-8d82-f69adb1a0774\") " pod="openshift-monitoring/node-exporter-dqszd" Mar 18 16:47:36.492554 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.492553 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4d7aaf65-f284-4670-8d82-f69adb1a0774-node-exporter-wtmp\") pod \"node-exporter-dqszd\" (UID: \"4d7aaf65-f284-4670-8d82-f69adb1a0774\") " pod="openshift-monitoring/node-exporter-dqszd" Mar 18 16:47:36.492786 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.492588 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/556af7ea-9e15-4fb2-b326-af5947c1713a-openshift-state-metrics-tls\") pod \"openshift-state-metrics-68b5d5d464-qmgzp\" (UID: \"556af7ea-9e15-4fb2-b326-af5947c1713a\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-qmgzp" Mar 18 16:47:36.492786 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.492708 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4d7aaf65-f284-4670-8d82-f69adb1a0774-node-exporter-tls\") pod \"node-exporter-dqszd\" (UID: \"4d7aaf65-f284-4670-8d82-f69adb1a0774\") " pod="openshift-monitoring/node-exporter-dqszd" Mar 18 16:47:36.492786 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.492726 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4d7aaf65-f284-4670-8d82-f69adb1a0774-node-exporter-wtmp\") pod \"node-exporter-dqszd\" (UID: \"4d7aaf65-f284-4670-8d82-f69adb1a0774\") " pod="openshift-monitoring/node-exporter-dqszd" Mar 18 16:47:36.492786 
ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.492760 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/556af7ea-9e15-4fb2-b326-af5947c1713a-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-68b5d5d464-qmgzp\" (UID: \"556af7ea-9e15-4fb2-b326-af5947c1713a\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-qmgzp" Mar 18 16:47:36.492786 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.492784 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4d7aaf65-f284-4670-8d82-f69adb1a0774-sys\") pod \"node-exporter-dqszd\" (UID: \"4d7aaf65-f284-4670-8d82-f69adb1a0774\") " pod="openshift-monitoring/node-exporter-dqszd" Mar 18 16:47:36.493023 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.492821 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/556af7ea-9e15-4fb2-b326-af5947c1713a-metrics-client-ca\") pod \"openshift-state-metrics-68b5d5d464-qmgzp\" (UID: \"556af7ea-9e15-4fb2-b326-af5947c1713a\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-qmgzp" Mar 18 16:47:36.493023 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.492854 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-grr29\" (UniqueName: \"kubernetes.io/projected/556af7ea-9e15-4fb2-b326-af5947c1713a-kube-api-access-grr29\") pod \"openshift-state-metrics-68b5d5d464-qmgzp\" (UID: \"556af7ea-9e15-4fb2-b326-af5947c1713a\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-qmgzp" Mar 18 16:47:36.493023 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.492886 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/4d7aaf65-f284-4670-8d82-f69adb1a0774-sys\") pod \"node-exporter-dqszd\" (UID: \"4d7aaf65-f284-4670-8d82-f69adb1a0774\") " pod="openshift-monitoring/node-exporter-dqszd" Mar 18 16:47:36.493023 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.492924 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4d7aaf65-f284-4670-8d82-f69adb1a0774-node-exporter-textfile\") pod \"node-exporter-dqszd\" (UID: \"4d7aaf65-f284-4670-8d82-f69adb1a0774\") " pod="openshift-monitoring/node-exporter-dqszd" Mar 18 16:47:36.493023 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.492962 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4d7aaf65-f284-4670-8d82-f69adb1a0774-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dqszd\" (UID: \"4d7aaf65-f284-4670-8d82-f69adb1a0774\") " pod="openshift-monitoring/node-exporter-dqszd" Mar 18 16:47:36.493023 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.492988 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-th6cl\" (UniqueName: \"kubernetes.io/projected/4d7aaf65-f284-4670-8d82-f69adb1a0774-kube-api-access-th6cl\") pod \"node-exporter-dqszd\" (UID: \"4d7aaf65-f284-4670-8d82-f69adb1a0774\") " pod="openshift-monitoring/node-exporter-dqszd" Mar 18 16:47:36.493336 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.493031 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4d7aaf65-f284-4670-8d82-f69adb1a0774-root\") pod \"node-exporter-dqszd\" (UID: \"4d7aaf65-f284-4670-8d82-f69adb1a0774\") " pod="openshift-monitoring/node-exporter-dqszd" Mar 18 16:47:36.493336 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.493058 2536 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4d7aaf65-f284-4670-8d82-f69adb1a0774-metrics-client-ca\") pod \"node-exporter-dqszd\" (UID: \"4d7aaf65-f284-4670-8d82-f69adb1a0774\") " pod="openshift-monitoring/node-exporter-dqszd" Mar 18 16:47:36.493336 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.493192 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4d7aaf65-f284-4670-8d82-f69adb1a0774-node-exporter-textfile\") pod \"node-exporter-dqszd\" (UID: \"4d7aaf65-f284-4670-8d82-f69adb1a0774\") " pod="openshift-monitoring/node-exporter-dqszd" Mar 18 16:47:36.493336 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.493199 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4d7aaf65-f284-4670-8d82-f69adb1a0774-node-exporter-accelerators-collector-config\") pod \"node-exporter-dqszd\" (UID: \"4d7aaf65-f284-4670-8d82-f69adb1a0774\") " pod="openshift-monitoring/node-exporter-dqszd" Mar 18 16:47:36.493523 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.493351 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4d7aaf65-f284-4670-8d82-f69adb1a0774-root\") pod \"node-exporter-dqszd\" (UID: \"4d7aaf65-f284-4670-8d82-f69adb1a0774\") " pod="openshift-monitoring/node-exporter-dqszd" Mar 18 16:47:36.493631 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.493610 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4d7aaf65-f284-4670-8d82-f69adb1a0774-metrics-client-ca\") pod \"node-exporter-dqszd\" (UID: \"4d7aaf65-f284-4670-8d82-f69adb1a0774\") " pod="openshift-monitoring/node-exporter-dqszd" Mar 18 16:47:36.494123 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.494096 2536 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/556af7ea-9e15-4fb2-b326-af5947c1713a-metrics-client-ca\") pod \"openshift-state-metrics-68b5d5d464-qmgzp\" (UID: \"556af7ea-9e15-4fb2-b326-af5947c1713a\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-qmgzp" Mar 18 16:47:36.495546 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.495520 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4d7aaf65-f284-4670-8d82-f69adb1a0774-node-exporter-tls\") pod \"node-exporter-dqszd\" (UID: \"4d7aaf65-f284-4670-8d82-f69adb1a0774\") " pod="openshift-monitoring/node-exporter-dqszd" Mar 18 16:47:36.495546 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.495533 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4d7aaf65-f284-4670-8d82-f69adb1a0774-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dqszd\" (UID: \"4d7aaf65-f284-4670-8d82-f69adb1a0774\") " pod="openshift-monitoring/node-exporter-dqszd" Mar 18 16:47:36.495819 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.495798 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/556af7ea-9e15-4fb2-b326-af5947c1713a-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-68b5d5d464-qmgzp\" (UID: \"556af7ea-9e15-4fb2-b326-af5947c1713a\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-qmgzp" Mar 18 16:47:36.495984 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.495967 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/556af7ea-9e15-4fb2-b326-af5947c1713a-openshift-state-metrics-tls\") pod 
\"openshift-state-metrics-68b5d5d464-qmgzp\" (UID: \"556af7ea-9e15-4fb2-b326-af5947c1713a\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-qmgzp" Mar 18 16:47:36.501052 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.501030 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-th6cl\" (UniqueName: \"kubernetes.io/projected/4d7aaf65-f284-4670-8d82-f69adb1a0774-kube-api-access-th6cl\") pod \"node-exporter-dqszd\" (UID: \"4d7aaf65-f284-4670-8d82-f69adb1a0774\") " pod="openshift-monitoring/node-exporter-dqszd" Mar 18 16:47:36.501205 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.501185 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-grr29\" (UniqueName: \"kubernetes.io/projected/556af7ea-9e15-4fb2-b326-af5947c1713a-kube-api-access-grr29\") pod \"openshift-state-metrics-68b5d5d464-qmgzp\" (UID: \"556af7ea-9e15-4fb2-b326-af5947c1713a\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-qmgzp" Mar 18 16:47:36.539684 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.539662 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-qmgzp" Mar 18 16:47:36.561325 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.561305 2536 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-dqszd" Mar 18 16:47:36.568640 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:47:36.568621 2536 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d7aaf65_f284_4670_8d82_f69adb1a0774.slice/crio-9e29af8dd9bac20b7e203e25d4cd59faa8543a79438afac4a55693de67d97edc WatchSource:0}: Error finding container 9e29af8dd9bac20b7e203e25d4cd59faa8543a79438afac4a55693de67d97edc: Status 404 returned error can't find the container with id 9e29af8dd9bac20b7e203e25d4cd59faa8543a79438afac4a55693de67d97edc Mar 18 16:47:36.654173 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.654018 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-68b5d5d464-qmgzp"] Mar 18 16:47:36.656619 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:47:36.656589 2536 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod556af7ea_9e15_4fb2_b326_af5947c1713a.slice/crio-81854dd0bf26fa60a81abb26112e6cc11adbb5c79856efc70225bc45ee541782 WatchSource:0}: Error finding container 81854dd0bf26fa60a81abb26112e6cc11adbb5c79856efc70225bc45ee541782: Status 404 returned error can't find the container with id 81854dd0bf26fa60a81abb26112e6cc11adbb5c79856efc70225bc45ee541782 Mar 18 16:47:36.897348 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.897312 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/25bd2c32-7449-4f75-ab0e-7b815e14c3ca-kube-state-metrics-tls\") pod \"kube-state-metrics-6df7999c47-v5fw4\" (UID: \"25bd2c32-7449-4f75-ab0e-7b815e14c3ca\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-v5fw4" Mar 18 16:47:36.899591 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:36.899573 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/25bd2c32-7449-4f75-ab0e-7b815e14c3ca-kube-state-metrics-tls\") pod \"kube-state-metrics-6df7999c47-v5fw4\" (UID: \"25bd2c32-7449-4f75-ab0e-7b815e14c3ca\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-v5fw4" Mar 18 16:47:37.015246 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.015221 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-6df7999c47-v5fw4" Mar 18 16:47:37.148608 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.148574 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-6df7999c47-v5fw4"] Mar 18 16:47:37.152101 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:47:37.152063 2536 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25bd2c32_7449_4f75_ab0e_7b815e14c3ca.slice/crio-cb17773b222cdd1db3be410131ffce8e249de472f999126f43430f9dd657fff8 WatchSource:0}: Error finding container cb17773b222cdd1db3be410131ffce8e249de472f999126f43430f9dd657fff8: Status 404 returned error can't find the container with id cb17773b222cdd1db3be410131ffce8e249de472f999126f43430f9dd657fff8 Mar 18 16:47:37.277327 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.277122 2536 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 18 16:47:37.280939 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.280918 2536 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:37.283120 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.283066 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Mar 18 16:47:37.283236 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.283132 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-bdkn5\"" Mar 18 16:47:37.283236 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.283130 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Mar 18 16:47:37.283448 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.283307 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Mar 18 16:47:37.283448 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.283440 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Mar 18 16:47:37.283555 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.283540 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Mar 18 16:47:37.283761 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.283741 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Mar 18 16:47:37.283761 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.283755 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Mar 18 16:47:37.283888 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.283779 2536 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Mar 18 16:47:37.283888 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.283742 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Mar 18 16:47:37.297022 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.297002 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 18 16:47:37.401992 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.401958 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/73894801-3a68-4b42-ab84-cc9a2706cb9a-config-out\") pod \"alertmanager-main-0\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:37.402163 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.402009 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/73894801-3a68-4b42-ab84-cc9a2706cb9a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:37.402163 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.402105 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbhzk\" (UniqueName: \"kubernetes.io/projected/73894801-3a68-4b42-ab84-cc9a2706cb9a-kube-api-access-dbhzk\") pod \"alertmanager-main-0\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:37.402163 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.402139 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73894801-3a68-4b42-ab84-cc9a2706cb9a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:37.402349 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.402193 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/73894801-3a68-4b42-ab84-cc9a2706cb9a-config-volume\") pod \"alertmanager-main-0\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:37.402349 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.402247 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/73894801-3a68-4b42-ab84-cc9a2706cb9a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:37.402349 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.402294 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/73894801-3a68-4b42-ab84-cc9a2706cb9a-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:37.402349 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.402314 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/73894801-3a68-4b42-ab84-cc9a2706cb9a-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " 
pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:37.402551 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.402364 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/73894801-3a68-4b42-ab84-cc9a2706cb9a-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:37.402551 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.402389 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/73894801-3a68-4b42-ab84-cc9a2706cb9a-tls-assets\") pod \"alertmanager-main-0\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:37.402551 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.402411 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/73894801-3a68-4b42-ab84-cc9a2706cb9a-web-config\") pod \"alertmanager-main-0\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:37.402695 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.402547 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/73894801-3a68-4b42-ab84-cc9a2706cb9a-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:37.402695 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.402597 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/73894801-3a68-4b42-ab84-cc9a2706cb9a-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:37.487951 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.487899 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-6df7999c47-v5fw4" event={"ID":"25bd2c32-7449-4f75-ab0e-7b815e14c3ca","Type":"ContainerStarted","Data":"cb17773b222cdd1db3be410131ffce8e249de472f999126f43430f9dd657fff8"} Mar 18 16:47:37.489211 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.489189 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dqszd" event={"ID":"4d7aaf65-f284-4670-8d82-f69adb1a0774","Type":"ContainerStarted","Data":"4dd32b65b8e3aa10e227e5acd6cf2a4735f0c038ef9fae85a8827cb847a5cd43"} Mar 18 16:47:37.489338 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.489220 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dqszd" event={"ID":"4d7aaf65-f284-4670-8d82-f69adb1a0774","Type":"ContainerStarted","Data":"9e29af8dd9bac20b7e203e25d4cd59faa8543a79438afac4a55693de67d97edc"} Mar 18 16:47:37.490945 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.490918 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-qmgzp" event={"ID":"556af7ea-9e15-4fb2-b326-af5947c1713a","Type":"ContainerStarted","Data":"bfdd3e6097bfdf9a2a3158ca7097e83ef462af25af75cd32bc9d613c7a2f6610"} Mar 18 16:47:37.491036 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.490948 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-qmgzp" event={"ID":"556af7ea-9e15-4fb2-b326-af5947c1713a","Type":"ContainerStarted","Data":"0101e1fdb94bcc10ca7be4efce03947fc8a9f6b9708cb0c5fb5ba7312d6e4a4a"} Mar 18 16:47:37.491036 
ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.490963 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-qmgzp" event={"ID":"556af7ea-9e15-4fb2-b326-af5947c1713a","Type":"ContainerStarted","Data":"81854dd0bf26fa60a81abb26112e6cc11adbb5c79856efc70225bc45ee541782"} Mar 18 16:47:37.503797 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.503776 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/73894801-3a68-4b42-ab84-cc9a2706cb9a-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:37.503915 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.503810 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/73894801-3a68-4b42-ab84-cc9a2706cb9a-tls-assets\") pod \"alertmanager-main-0\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:37.503915 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.503829 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/73894801-3a68-4b42-ab84-cc9a2706cb9a-web-config\") pod \"alertmanager-main-0\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:37.503915 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.503879 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/73894801-3a68-4b42-ab84-cc9a2706cb9a-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:37.503915 ip-10-0-132-224 kubenswrapper[2536]: 
I0318 16:47:37.503904 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/73894801-3a68-4b42-ab84-cc9a2706cb9a-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:37.504110 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.503933 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/73894801-3a68-4b42-ab84-cc9a2706cb9a-config-out\") pod \"alertmanager-main-0\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:37.504110 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.503957 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/73894801-3a68-4b42-ab84-cc9a2706cb9a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:37.504110 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.503988 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dbhzk\" (UniqueName: \"kubernetes.io/projected/73894801-3a68-4b42-ab84-cc9a2706cb9a-kube-api-access-dbhzk\") pod \"alertmanager-main-0\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:37.504110 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.504017 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73894801-3a68-4b42-ab84-cc9a2706cb9a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: 
\"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:37.504110 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.504043 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/73894801-3a68-4b42-ab84-cc9a2706cb9a-config-volume\") pod \"alertmanager-main-0\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:37.504110 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.504067 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/73894801-3a68-4b42-ab84-cc9a2706cb9a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:37.504110 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.504090 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/73894801-3a68-4b42-ab84-cc9a2706cb9a-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:37.504480 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.504113 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/73894801-3a68-4b42-ab84-cc9a2706cb9a-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:37.504480 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.504373 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: 
\"kubernetes.io/empty-dir/73894801-3a68-4b42-ab84-cc9a2706cb9a-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:37.504770 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.504600 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/73894801-3a68-4b42-ab84-cc9a2706cb9a-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:37.504770 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:47:37.504687 2536 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Mar 18 16:47:37.504770 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:47:37.504757 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/73894801-3a68-4b42-ab84-cc9a2706cb9a-secret-alertmanager-main-tls podName:73894801-3a68-4b42-ab84-cc9a2706cb9a nodeName:}" failed. No retries permitted until 2026-03-18 16:47:38.004730336 +0000 UTC m=+180.778304906 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/73894801-3a68-4b42-ab84-cc9a2706cb9a-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "73894801-3a68-4b42-ab84-cc9a2706cb9a") : secret "alertmanager-main-tls" not found Mar 18 16:47:37.505103 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.505076 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73894801-3a68-4b42-ab84-cc9a2706cb9a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:37.507628 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.507466 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/73894801-3a68-4b42-ab84-cc9a2706cb9a-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:37.509243 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.508711 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/73894801-3a68-4b42-ab84-cc9a2706cb9a-tls-assets\") pod \"alertmanager-main-0\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:37.509243 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.508721 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/73894801-3a68-4b42-ab84-cc9a2706cb9a-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:37.509243 ip-10-0-132-224 kubenswrapper[2536]: I0318 
16:47:37.509150 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/73894801-3a68-4b42-ab84-cc9a2706cb9a-web-config\") pod \"alertmanager-main-0\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:37.509243 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.509193 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/73894801-3a68-4b42-ab84-cc9a2706cb9a-config-volume\") pod \"alertmanager-main-0\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:37.509243 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.509227 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/73894801-3a68-4b42-ab84-cc9a2706cb9a-config-out\") pod \"alertmanager-main-0\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:37.509607 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.509583 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/73894801-3a68-4b42-ab84-cc9a2706cb9a-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:37.510246 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.510213 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/73894801-3a68-4b42-ab84-cc9a2706cb9a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:47:37.512534 
ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:37.512511 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbhzk\" (UniqueName: \"kubernetes.io/projected/73894801-3a68-4b42-ab84-cc9a2706cb9a-kube-api-access-dbhzk\") pod \"alertmanager-main-0\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:38.009231 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:38.009167 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/73894801-3a68-4b42-ab84-cc9a2706cb9a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:38.011862 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:38.011836 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/73894801-3a68-4b42-ab84-cc9a2706cb9a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:38.196147 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:38.196117 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-bdkn5\""
Mar 18 16:47:38.204766 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:38.204747 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:38.495386 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:38.495356 2536 generic.go:358] "Generic (PLEG): container finished" podID="4d7aaf65-f284-4670-8d82-f69adb1a0774" containerID="4dd32b65b8e3aa10e227e5acd6cf2a4735f0c038ef9fae85a8827cb847a5cd43" exitCode=0
Mar 18 16:47:38.495487 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:38.495433 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dqszd" event={"ID":"4d7aaf65-f284-4670-8d82-f69adb1a0774","Type":"ContainerDied","Data":"4dd32b65b8e3aa10e227e5acd6cf2a4735f0c038ef9fae85a8827cb847a5cd43"}
Mar 18 16:47:38.497554 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:38.497519 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-qmgzp" event={"ID":"556af7ea-9e15-4fb2-b326-af5947c1713a","Type":"ContainerStarted","Data":"c0784c8e14f0d2c1ba065bb05f60b7ea82041c8d10b449c79da2c42185862017"}
Mar 18 16:47:38.532286 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:38.532224 2536 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-qmgzp" podStartSLOduration=1.5977882650000002 podStartE2EDuration="2.53220679s" podCreationTimestamp="2026-03-18 16:47:36 +0000 UTC" firstStartedPulling="2026-03-18 16:47:36.77941714 +0000 UTC m=+179.552991717" lastFinishedPulling="2026-03-18 16:47:37.713835663 +0000 UTC m=+180.487410242" observedRunningTime="2026-03-18 16:47:38.530943405 +0000 UTC m=+181.304518004" watchObservedRunningTime="2026-03-18 16:47:38.53220679 +0000 UTC m=+181.305781379"
Mar 18 16:47:38.563018 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:38.562990 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Mar 18 16:47:38.567250 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:47:38.567155 2536 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73894801_3a68_4b42_ab84_cc9a2706cb9a.slice/crio-4dd803d85f3147187cff0297b37d38cd0028503bfa757520ae8392a27d27c150 WatchSource:0}: Error finding container 4dd803d85f3147187cff0297b37d38cd0028503bfa757520ae8392a27d27c150: Status 404 returned error can't find the container with id 4dd803d85f3147187cff0297b37d38cd0028503bfa757520ae8392a27d27c150
Mar 18 16:47:39.502822 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:39.502782 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-6df7999c47-v5fw4" event={"ID":"25bd2c32-7449-4f75-ab0e-7b815e14c3ca","Type":"ContainerStarted","Data":"51dcb095b560ca719bcddc32f7ddee043ee3ce27a3a6efc66d124be98a108f7d"}
Mar 18 16:47:39.502822 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:39.502826 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-6df7999c47-v5fw4" event={"ID":"25bd2c32-7449-4f75-ab0e-7b815e14c3ca","Type":"ContainerStarted","Data":"3b658b8b637e7113fa5102ec877fba39099af9a2b0e60de6cb14a0221335613c"}
Mar 18 16:47:39.503349 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:39.502842 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-6df7999c47-v5fw4" event={"ID":"25bd2c32-7449-4f75-ab0e-7b815e14c3ca","Type":"ContainerStarted","Data":"676ba33f0619a5719183527a8219f8ec7f0379cb696ff0d43e68665fc76da1ab"}
Mar 18 16:47:39.504083 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:39.504054 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"73894801-3a68-4b42-ab84-cc9a2706cb9a","Type":"ContainerStarted","Data":"4dd803d85f3147187cff0297b37d38cd0028503bfa757520ae8392a27d27c150"}
Mar 18 16:47:39.506074 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:39.506045 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dqszd" event={"ID":"4d7aaf65-f284-4670-8d82-f69adb1a0774","Type":"ContainerStarted","Data":"90f38d8af2e4a2b501c03126f76cee8714a02ed9c88d5b9fa4e27f5bdaf05170"}
Mar 18 16:47:39.506197 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:39.506086 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dqszd" event={"ID":"4d7aaf65-f284-4670-8d82-f69adb1a0774","Type":"ContainerStarted","Data":"a88dff12d989c87d95900097915c2c831ed11431afd3f24682dbb39de48935de"}
Mar 18 16:47:39.523018 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:39.522964 2536 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-6df7999c47-v5fw4" podStartSLOduration=2.193680532 podStartE2EDuration="3.522950363s" podCreationTimestamp="2026-03-18 16:47:36 +0000 UTC" firstStartedPulling="2026-03-18 16:47:37.1540531 +0000 UTC m=+179.927627667" lastFinishedPulling="2026-03-18 16:47:38.483322891 +0000 UTC m=+181.256897498" observedRunningTime="2026-03-18 16:47:39.522653344 +0000 UTC m=+182.296227925" watchObservedRunningTime="2026-03-18 16:47:39.522950363 +0000 UTC m=+182.296524950"
Mar 18 16:47:39.550107 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:39.549187 2536 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-dqszd" podStartSLOduration=2.701641362 podStartE2EDuration="3.549169574s" podCreationTimestamp="2026-03-18 16:47:36 +0000 UTC" firstStartedPulling="2026-03-18 16:47:36.570294454 +0000 UTC m=+179.343869021" lastFinishedPulling="2026-03-18 16:47:37.41782267 +0000 UTC m=+180.191397233" observedRunningTime="2026-03-18 16:47:39.547538099 +0000 UTC m=+182.321112689" watchObservedRunningTime="2026-03-18 16:47:39.549169574 +0000 UTC m=+182.322744161"
Mar 18 16:47:40.510970 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:40.510927 2536 generic.go:358] "Generic (PLEG): container finished" podID="73894801-3a68-4b42-ab84-cc9a2706cb9a" containerID="a609cf0c45a4f1f2535328ca3153ded92b662a8726ba2c71ebfd7cc42306c0a2" exitCode=0
Mar 18 16:47:40.511463 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:40.511012 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"73894801-3a68-4b42-ab84-cc9a2706cb9a","Type":"ContainerDied","Data":"a609cf0c45a4f1f2535328ca3153ded92b662a8726ba2c71ebfd7cc42306c0a2"}
Mar 18 16:47:41.448235 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:41.448203 2536 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-8598bb85b4-b5rkb"
Mar 18 16:47:42.380242 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.380166 2536 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Mar 18 16:47:42.383740 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.383722 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.385849 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.385825 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Mar 18 16:47:42.385849 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.385846 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-b9qsevsatj8e6\""
Mar 18 16:47:42.385997 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.385846 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Mar 18 16:47:42.386169 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.386152 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Mar 18 16:47:42.386213 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.386198 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Mar 18 16:47:42.386261 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.386242 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Mar 18 16:47:42.386384 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.386368 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-tthfc\""
Mar 18 16:47:42.386607 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.386585 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Mar 18 16:47:42.386672 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.386634 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Mar 18 16:47:42.387018 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.386996 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Mar 18 16:47:42.387199 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.387005 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Mar 18 16:47:42.387199 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.387075 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Mar 18 16:47:42.387515 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.387498 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Mar 18 16:47:42.388207 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.388184 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Mar 18 16:47:42.392700 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.392258 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Mar 18 16:47:42.398009 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.397988 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Mar 18 16:47:42.519746 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.519721 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"73894801-3a68-4b42-ab84-cc9a2706cb9a","Type":"ContainerStarted","Data":"b4fd04ce80fcdb4742b98bf591354f00f90c3208ed342aa74d043b8318fdf2b0"}
Mar 18 16:47:42.519849 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.519752 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"73894801-3a68-4b42-ab84-cc9a2706cb9a","Type":"ContainerStarted","Data":"0243ad32a23d1feaf823fd9a11c064bb6c6a1b293ebfc100069f848832ec5c55"}
Mar 18 16:47:42.519849 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.519768 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"73894801-3a68-4b42-ab84-cc9a2706cb9a","Type":"ContainerStarted","Data":"ec3c119c71e3a951218dd6104dfc7b6b1dafde1af741d630d70e0202fab7d2c0"}
Mar 18 16:47:42.519849 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.519779 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"73894801-3a68-4b42-ab84-cc9a2706cb9a","Type":"ContainerStarted","Data":"ad93b6d2d40da90c4ae20f90da6da074b9374902d9a61877da86b11c622fcb11"}
Mar 18 16:47:42.519849 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.519787 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"73894801-3a68-4b42-ab84-cc9a2706cb9a","Type":"ContainerStarted","Data":"e0d393b3c62b27773fb5a9f17a499496d070291526ab6fd6f60f31243e060062"}
Mar 18 16:47:42.546114 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.546088 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.546245 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.546127 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.546245 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.546153 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec496183-6a46-48d9-9cd3-79969e0f09a5-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.546245 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.546235 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ec496183-6a46-48d9-9cd3-79969e0f09a5-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.546447 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.546257 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec496183-6a46-48d9-9cd3-79969e0f09a5-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.546447 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.546310 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.546447 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.546376 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.546611 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.546450 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ec496183-6a46-48d9-9cd3-79969e0f09a5-config-out\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.546611 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.546490 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.546611 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.546543 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.546611 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.546575 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec496183-6a46-48d9-9cd3-79969e0f09a5-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.546611 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.546607 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ec496183-6a46-48d9-9cd3-79969e0f09a5-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.546870 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.546632 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ec496183-6a46-48d9-9cd3-79969e0f09a5-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.546870 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.546657 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-web-config\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.546870 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.546678 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-config\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.546870 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.546700 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghwcx\" (UniqueName: \"kubernetes.io/projected/ec496183-6a46-48d9-9cd3-79969e0f09a5-kube-api-access-ghwcx\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.546870 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.546728 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.546870 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.546751 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ec496183-6a46-48d9-9cd3-79969e0f09a5-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.647624 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.647555 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-web-config\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.647624 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.647591 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-config\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.647624 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.647611 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ghwcx\" (UniqueName: \"kubernetes.io/projected/ec496183-6a46-48d9-9cd3-79969e0f09a5-kube-api-access-ghwcx\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.647869 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.647648 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.647869 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.647677 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ec496183-6a46-48d9-9cd3-79969e0f09a5-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.647869 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.647713 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.647869 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.647738 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.647869 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.647762 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec496183-6a46-48d9-9cd3-79969e0f09a5-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.647869 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.647817 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ec496183-6a46-48d9-9cd3-79969e0f09a5-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.647869 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.647843 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec496183-6a46-48d9-9cd3-79969e0f09a5-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.648381 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.647882 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.648381 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.647910 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.648381 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.647941 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ec496183-6a46-48d9-9cd3-79969e0f09a5-config-out\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.648381 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.647977 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.648381 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.648309 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ec496183-6a46-48d9-9cd3-79969e0f09a5-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.650229 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.649223 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec496183-6a46-48d9-9cd3-79969e0f09a5-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.650229 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.649312 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.650229 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.649358 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec496183-6a46-48d9-9cd3-79969e0f09a5-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.650229 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.649391 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ec496183-6a46-48d9-9cd3-79969e0f09a5-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.650229 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.649421 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ec496183-6a46-48d9-9cd3-79969e0f09a5-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.650229 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.650109 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ec496183-6a46-48d9-9cd3-79969e0f09a5-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.650589 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.650530 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-config\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.650849 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.650823 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.651348 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.651323 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.652347 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.651777 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-web-config\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.652347 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.652240 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec496183-6a46-48d9-9cd3-79969e0f09a5-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.652347 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.652298 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec496183-6a46-48d9-9cd3-79969e0f09a5-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.652347 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.652288 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.653183 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.653144 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.653501 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.653476 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ec496183-6a46-48d9-9cd3-79969e0f09a5-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.653581 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.653536 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.653581 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.653555 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.653581 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.653564 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.654285 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.654250 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ec496183-6a46-48d9-9cd3-79969e0f09a5-config-out\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.654668 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.654646 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ec496183-6a46-48d9-9cd3-79969e0f09a5-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.655640 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.655622 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghwcx\" (UniqueName: \"kubernetes.io/projected/ec496183-6a46-48d9-9cd3-79969e0f09a5-kube-api-access-ghwcx\") pod \"prometheus-k8s-0\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.696246 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.696219 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:42.886206 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:42.886178 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Mar 18 16:47:42.890743 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:47:42.890709 2536 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec496183_6a46_48d9_9cd3_79969e0f09a5.slice/crio-abab13052157044e9f89c35679d550d5f26d6f4233f746239f4d75d1ba8c16e1 WatchSource:0}: Error finding container abab13052157044e9f89c35679d550d5f26d6f4233f746239f4d75d1ba8c16e1: Status 404 returned error can't find the container with id abab13052157044e9f89c35679d550d5f26d6f4233f746239f4d75d1ba8c16e1
Mar 18 16:47:43.523772 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:43.523738 2536 generic.go:358] "Generic (PLEG): container finished" podID="ec496183-6a46-48d9-9cd3-79969e0f09a5" containerID="f35f3e6c310503f1d7efd45005c817bdb81d28100c6ba95893fea781c894f315" exitCode=0
Mar 18 16:47:43.524200 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:43.523822 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ec496183-6a46-48d9-9cd3-79969e0f09a5","Type":"ContainerDied","Data":"f35f3e6c310503f1d7efd45005c817bdb81d28100c6ba95893fea781c894f315"}
Mar 18 16:47:43.524200 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:43.523859 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ec496183-6a46-48d9-9cd3-79969e0f09a5","Type":"ContainerStarted","Data":"abab13052157044e9f89c35679d550d5f26d6f4233f746239f4d75d1ba8c16e1"}
Mar 18 16:47:43.526815 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:43.526799 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"73894801-3a68-4b42-ab84-cc9a2706cb9a","Type":"ContainerStarted","Data":"69584e2135ce10ecc7ea3050e8584d2a3c98af82da41b26a136c14c3f1958d42"}
Mar 18 16:47:43.579420 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:43.579383 2536 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.33756116 podStartE2EDuration="6.579370307s" podCreationTimestamp="2026-03-18 16:47:37 +0000 UTC" firstStartedPulling="2026-03-18 16:47:38.569207632 +0000 UTC m=+181.342782201" lastFinishedPulling="2026-03-18 16:47:42.811016784 +0000 UTC m=+185.584591348" observedRunningTime="2026-03-18 16:47:43.577585911 +0000 UTC m=+186.351160499" watchObservedRunningTime="2026-03-18 16:47:43.579370307 +0000 UTC m=+186.352944944"
Mar 18 16:47:43.906550 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:43.906476 2536 scope.go:117] "RemoveContainer" containerID="2eecbd7aad2517e638c2f2ffa41628d27302acd99633acfdb5892c6e1ffa75fd"
Mar 18 16:47:44.532325 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:44.532292 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-mx9sz_95e73a3f-8a85-403f-b00b-17524a80b500/console-operator/2.log"
Mar 18 16:47:44.532755 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:44.532438 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-76b8565867-mx9sz" event={"ID":"95e73a3f-8a85-403f-b00b-17524a80b500","Type":"ContainerStarted","Data":"d961a2a957945420882a3f8e1ce18517c5c5b9a93779bf856e9baf91dd114484"}
Mar 18 16:47:44.533107 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:44.533083 2536 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-76b8565867-mx9sz"
Mar 18 16:47:44.549180 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:44.548933 2536 pod_startup_latency_tracker.go:104] "Observed pod startup duration"
pod="openshift-console-operator/console-operator-76b8565867-mx9sz" podStartSLOduration=53.936639731 podStartE2EDuration="55.548881438s" podCreationTimestamp="2026-03-18 16:46:49 +0000 UTC" firstStartedPulling="2026-03-18 16:46:49.787363126 +0000 UTC m=+132.560937689" lastFinishedPulling="2026-03-18 16:46:51.399604833 +0000 UTC m=+134.173179396" observedRunningTime="2026-03-18 16:47:44.548765824 +0000 UTC m=+187.322340411" watchObservedRunningTime="2026-03-18 16:47:44.548881438 +0000 UTC m=+187.322456026" Mar 18 16:47:45.533301 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:45.533241 2536 patch_prober.go:28] interesting pod/console-operator-76b8565867-mx9sz container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.133.0.9:8443/readyz\": context deadline exceeded" start-of-body= Mar 18 16:47:45.533793 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:45.533326 2536 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-76b8565867-mx9sz" podUID="95e73a3f-8a85-403f-b00b-17524a80b500" containerName="console-operator" probeResult="failure" output="Get \"https://10.133.0.9:8443/readyz\": context deadline exceeded" Mar 18 16:47:45.596017 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:45.595991 2536 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-76b8565867-mx9sz" Mar 18 16:47:46.540511 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:46.540433 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ec496183-6a46-48d9-9cd3-79969e0f09a5","Type":"ContainerStarted","Data":"bc8804455305327453ae7a4611ccbfac4205b6416e35786217b763457495f68d"} Mar 18 16:47:46.540511 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:46.540476 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"ec496183-6a46-48d9-9cd3-79969e0f09a5","Type":"ContainerStarted","Data":"e88e3863249bf414fa9d4d3a62ab8c792d4da149b3408350453212f59e45b004"} Mar 18 16:47:48.549024 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:48.548994 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ec496183-6a46-48d9-9cd3-79969e0f09a5","Type":"ContainerStarted","Data":"84cbac1d887142ae76b0efe704614130870d0bfb7cfb3979367e24203a6cbe97"} Mar 18 16:47:48.549413 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:48.549029 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ec496183-6a46-48d9-9cd3-79969e0f09a5","Type":"ContainerStarted","Data":"68ba1d01581acec4ce60fdfef6c83a79c8cc12070e86c7b1dff3fbf9f9699fcc"} Mar 18 16:47:48.549413 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:48.549039 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ec496183-6a46-48d9-9cd3-79969e0f09a5","Type":"ContainerStarted","Data":"66e77476d47e8a3093be7bdaf14b45a2cfcbb87c85889307c75534c9f312f23f"} Mar 18 16:47:48.549413 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:48.549047 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ec496183-6a46-48d9-9cd3-79969e0f09a5","Type":"ContainerStarted","Data":"0866c607fbe8d1fb480a9dbe3a0aca31fe956a926a4ccfe67210cafe8188b60a"} Mar 18 16:47:48.574883 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:48.574840 2536 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.076680081 podStartE2EDuration="6.574824784s" podCreationTimestamp="2026-03-18 16:47:42 +0000 UTC" firstStartedPulling="2026-03-18 16:47:43.525161961 +0000 UTC m=+186.298736525" lastFinishedPulling="2026-03-18 16:47:48.023306663 +0000 UTC m=+190.796881228" observedRunningTime="2026-03-18 
16:47:48.573569013 +0000 UTC m=+191.347143589" watchObservedRunningTime="2026-03-18 16:47:48.574824784 +0000 UTC m=+191.348399370" Mar 18 16:47:52.697153 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:47:52.697121 2536 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:42.697218 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:42.697188 2536 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:42.713062 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:42.713038 2536 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:42.727122 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:42.727098 2536 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:48.609411 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:48.609362 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/275a2fa6-277f-40dc-a2bc-749a97550e2e-metrics-certs\") pod \"network-metrics-daemon-rjx6m\" (UID: \"275a2fa6-277f-40dc-a2bc-749a97550e2e\") " pod="openshift-multus/network-metrics-daemon-rjx6m" Mar 18 16:48:48.611746 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:48.611719 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/275a2fa6-277f-40dc-a2bc-749a97550e2e-metrics-certs\") pod \"network-metrics-daemon-rjx6m\" (UID: \"275a2fa6-277f-40dc-a2bc-749a97550e2e\") " pod="openshift-multus/network-metrics-daemon-rjx6m" Mar 18 16:48:48.909348 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:48.909259 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-w8h72\"" Mar 18 16:48:48.917839 
ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:48.917806 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rjx6m" Mar 18 16:48:49.039515 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:49.039492 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rjx6m"] Mar 18 16:48:49.041813 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:48:49.041788 2536 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod275a2fa6_277f_40dc_a2bc_749a97550e2e.slice/crio-2e09a19ea5a8c7c6237006aeed389b2af3bbb5f4f17704d743c511444a553479 WatchSource:0}: Error finding container 2e09a19ea5a8c7c6237006aeed389b2af3bbb5f4f17704d743c511444a553479: Status 404 returned error can't find the container with id 2e09a19ea5a8c7c6237006aeed389b2af3bbb5f4f17704d743c511444a553479 Mar 18 16:48:49.730128 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:49.730079 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rjx6m" event={"ID":"275a2fa6-277f-40dc-a2bc-749a97550e2e","Type":"ContainerStarted","Data":"2e09a19ea5a8c7c6237006aeed389b2af3bbb5f4f17704d743c511444a553479"} Mar 18 16:48:50.734771 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:50.734734 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rjx6m" event={"ID":"275a2fa6-277f-40dc-a2bc-749a97550e2e","Type":"ContainerStarted","Data":"6563a5fc7b6a181530a9373deae1019a2529c0c1d285aed20f4d3e4f47c46dc3"} Mar 18 16:48:50.735168 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:50.734777 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rjx6m" event={"ID":"275a2fa6-277f-40dc-a2bc-749a97550e2e","Type":"ContainerStarted","Data":"4a90d5dc06b503d837a3625b0ea4a3bdb6801c137f67827cb15fa8e2e503ae61"} Mar 18 16:48:50.752663 ip-10-0-132-224 
kubenswrapper[2536]: I0318 16:48:50.752614 2536 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-rjx6m" podStartSLOduration=252.722677097 podStartE2EDuration="4m13.75260006s" podCreationTimestamp="2026-03-18 16:44:37 +0000 UTC" firstStartedPulling="2026-03-18 16:48:49.043666104 +0000 UTC m=+251.817240668" lastFinishedPulling="2026-03-18 16:48:50.073589066 +0000 UTC m=+252.847163631" observedRunningTime="2026-03-18 16:48:50.751171687 +0000 UTC m=+253.524746273" watchObservedRunningTime="2026-03-18 16:48:50.75260006 +0000 UTC m=+253.526174654" Mar 18 16:48:56.481895 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:56.481799 2536 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 18 16:48:56.482462 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:56.482392 2536 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="73894801-3a68-4b42-ab84-cc9a2706cb9a" containerName="alertmanager" containerID="cri-o://e0d393b3c62b27773fb5a9f17a499496d070291526ab6fd6f60f31243e060062" gracePeriod=120 Mar 18 16:48:56.482531 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:56.482436 2536 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="73894801-3a68-4b42-ab84-cc9a2706cb9a" containerName="kube-rbac-proxy" containerID="cri-o://0243ad32a23d1feaf823fd9a11c064bb6c6a1b293ebfc100069f848832ec5c55" gracePeriod=120 Mar 18 16:48:56.482531 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:56.482470 2536 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="73894801-3a68-4b42-ab84-cc9a2706cb9a" containerName="kube-rbac-proxy-metric" containerID="cri-o://b4fd04ce80fcdb4742b98bf591354f00f90c3208ed342aa74d043b8318fdf2b0" gracePeriod=120 Mar 18 16:48:56.482531 ip-10-0-132-224 
kubenswrapper[2536]: I0318 16:48:56.482480 2536 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="73894801-3a68-4b42-ab84-cc9a2706cb9a" containerName="kube-rbac-proxy-web" containerID="cri-o://ec3c119c71e3a951218dd6104dfc7b6b1dafde1af741d630d70e0202fab7d2c0" gracePeriod=120 Mar 18 16:48:56.482531 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:56.482496 2536 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="73894801-3a68-4b42-ab84-cc9a2706cb9a" containerName="config-reloader" containerID="cri-o://ad93b6d2d40da90c4ae20f90da6da074b9374902d9a61877da86b11c622fcb11" gracePeriod=120 Mar 18 16:48:56.482718 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:56.482444 2536 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="73894801-3a68-4b42-ab84-cc9a2706cb9a" containerName="prom-label-proxy" containerID="cri-o://69584e2135ce10ecc7ea3050e8584d2a3c98af82da41b26a136c14c3f1958d42" gracePeriod=120 Mar 18 16:48:56.757964 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:56.757890 2536 generic.go:358] "Generic (PLEG): container finished" podID="73894801-3a68-4b42-ab84-cc9a2706cb9a" containerID="69584e2135ce10ecc7ea3050e8584d2a3c98af82da41b26a136c14c3f1958d42" exitCode=0 Mar 18 16:48:56.757964 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:56.757913 2536 generic.go:358] "Generic (PLEG): container finished" podID="73894801-3a68-4b42-ab84-cc9a2706cb9a" containerID="0243ad32a23d1feaf823fd9a11c064bb6c6a1b293ebfc100069f848832ec5c55" exitCode=0 Mar 18 16:48:56.757964 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:56.757919 2536 generic.go:358] "Generic (PLEG): container finished" podID="73894801-3a68-4b42-ab84-cc9a2706cb9a" containerID="ad93b6d2d40da90c4ae20f90da6da074b9374902d9a61877da86b11c622fcb11" exitCode=0 Mar 18 16:48:56.757964 ip-10-0-132-224 
kubenswrapper[2536]: I0318 16:48:56.757926 2536 generic.go:358] "Generic (PLEG): container finished" podID="73894801-3a68-4b42-ab84-cc9a2706cb9a" containerID="e0d393b3c62b27773fb5a9f17a499496d070291526ab6fd6f60f31243e060062" exitCode=0 Mar 18 16:48:56.758191 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:56.757955 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"73894801-3a68-4b42-ab84-cc9a2706cb9a","Type":"ContainerDied","Data":"69584e2135ce10ecc7ea3050e8584d2a3c98af82da41b26a136c14c3f1958d42"} Mar 18 16:48:56.758191 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:56.757989 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"73894801-3a68-4b42-ab84-cc9a2706cb9a","Type":"ContainerDied","Data":"0243ad32a23d1feaf823fd9a11c064bb6c6a1b293ebfc100069f848832ec5c55"} Mar 18 16:48:56.758191 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:56.758002 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"73894801-3a68-4b42-ab84-cc9a2706cb9a","Type":"ContainerDied","Data":"ad93b6d2d40da90c4ae20f90da6da074b9374902d9a61877da86b11c622fcb11"} Mar 18 16:48:56.758191 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:56.758011 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"73894801-3a68-4b42-ab84-cc9a2706cb9a","Type":"ContainerDied","Data":"e0d393b3c62b27773fb5a9f17a499496d070291526ab6fd6f60f31243e060062"} Mar 18 16:48:57.718107 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.718086 2536 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:57.764943 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.764910 2536 generic.go:358] "Generic (PLEG): container finished" podID="73894801-3a68-4b42-ab84-cc9a2706cb9a" containerID="b4fd04ce80fcdb4742b98bf591354f00f90c3208ed342aa74d043b8318fdf2b0" exitCode=0 Mar 18 16:48:57.764943 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.764938 2536 generic.go:358] "Generic (PLEG): container finished" podID="73894801-3a68-4b42-ab84-cc9a2706cb9a" containerID="ec3c119c71e3a951218dd6104dfc7b6b1dafde1af741d630d70e0202fab7d2c0" exitCode=0 Mar 18 16:48:57.765180 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.764982 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"73894801-3a68-4b42-ab84-cc9a2706cb9a","Type":"ContainerDied","Data":"b4fd04ce80fcdb4742b98bf591354f00f90c3208ed342aa74d043b8318fdf2b0"} Mar 18 16:48:57.765180 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.765011 2536 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:57.765180 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.765025 2536 scope.go:117] "RemoveContainer" containerID="69584e2135ce10ecc7ea3050e8584d2a3c98af82da41b26a136c14c3f1958d42" Mar 18 16:48:57.765180 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.765015 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"73894801-3a68-4b42-ab84-cc9a2706cb9a","Type":"ContainerDied","Data":"ec3c119c71e3a951218dd6104dfc7b6b1dafde1af741d630d70e0202fab7d2c0"} Mar 18 16:48:57.765180 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.765156 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"73894801-3a68-4b42-ab84-cc9a2706cb9a","Type":"ContainerDied","Data":"4dd803d85f3147187cff0297b37d38cd0028503bfa757520ae8392a27d27c150"} Mar 18 16:48:57.772987 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.772959 2536 scope.go:117] "RemoveContainer" containerID="b4fd04ce80fcdb4742b98bf591354f00f90c3208ed342aa74d043b8318fdf2b0" Mar 18 16:48:57.780358 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.780341 2536 scope.go:117] "RemoveContainer" containerID="0243ad32a23d1feaf823fd9a11c064bb6c6a1b293ebfc100069f848832ec5c55" Mar 18 16:48:57.782353 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.782333 2536 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/73894801-3a68-4b42-ab84-cc9a2706cb9a-tls-assets\") pod \"73894801-3a68-4b42-ab84-cc9a2706cb9a\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " Mar 18 16:48:57.782460 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.782376 2536 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/73894801-3a68-4b42-ab84-cc9a2706cb9a-config-out\") pod 
\"73894801-3a68-4b42-ab84-cc9a2706cb9a\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " Mar 18 16:48:57.782460 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.782403 2536 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/73894801-3a68-4b42-ab84-cc9a2706cb9a-config-volume\") pod \"73894801-3a68-4b42-ab84-cc9a2706cb9a\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " Mar 18 16:48:57.782460 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.782425 2536 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/73894801-3a68-4b42-ab84-cc9a2706cb9a-cluster-tls-config\") pod \"73894801-3a68-4b42-ab84-cc9a2706cb9a\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " Mar 18 16:48:57.782622 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.782460 2536 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbhzk\" (UniqueName: \"kubernetes.io/projected/73894801-3a68-4b42-ab84-cc9a2706cb9a-kube-api-access-dbhzk\") pod \"73894801-3a68-4b42-ab84-cc9a2706cb9a\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " Mar 18 16:48:57.782622 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.782496 2536 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/73894801-3a68-4b42-ab84-cc9a2706cb9a-secret-alertmanager-main-tls\") pod \"73894801-3a68-4b42-ab84-cc9a2706cb9a\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " Mar 18 16:48:57.782622 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.782562 2536 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/73894801-3a68-4b42-ab84-cc9a2706cb9a-metrics-client-ca\") pod \"73894801-3a68-4b42-ab84-cc9a2706cb9a\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") 
" Mar 18 16:48:57.782766 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.782618 2536 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/73894801-3a68-4b42-ab84-cc9a2706cb9a-secret-alertmanager-kube-rbac-proxy-web\") pod \"73894801-3a68-4b42-ab84-cc9a2706cb9a\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " Mar 18 16:48:57.782766 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.782648 2536 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73894801-3a68-4b42-ab84-cc9a2706cb9a-alertmanager-trusted-ca-bundle\") pod \"73894801-3a68-4b42-ab84-cc9a2706cb9a\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " Mar 18 16:48:57.782766 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.782674 2536 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/73894801-3a68-4b42-ab84-cc9a2706cb9a-web-config\") pod \"73894801-3a68-4b42-ab84-cc9a2706cb9a\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " Mar 18 16:48:57.782766 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.782707 2536 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/73894801-3a68-4b42-ab84-cc9a2706cb9a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"73894801-3a68-4b42-ab84-cc9a2706cb9a\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " Mar 18 16:48:57.782766 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.782734 2536 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/73894801-3a68-4b42-ab84-cc9a2706cb9a-secret-alertmanager-kube-rbac-proxy\") pod \"73894801-3a68-4b42-ab84-cc9a2706cb9a\" (UID: 
\"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " Mar 18 16:48:57.782987 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.782800 2536 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/73894801-3a68-4b42-ab84-cc9a2706cb9a-alertmanager-main-db\") pod \"73894801-3a68-4b42-ab84-cc9a2706cb9a\" (UID: \"73894801-3a68-4b42-ab84-cc9a2706cb9a\") " Mar 18 16:48:57.783060 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.783032 2536 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73894801-3a68-4b42-ab84-cc9a2706cb9a-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "73894801-3a68-4b42-ab84-cc9a2706cb9a" (UID: "73894801-3a68-4b42-ab84-cc9a2706cb9a"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:48:57.783515 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.783469 2536 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73894801-3a68-4b42-ab84-cc9a2706cb9a-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "73894801-3a68-4b42-ab84-cc9a2706cb9a" (UID: "73894801-3a68-4b42-ab84-cc9a2706cb9a"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 16:48:57.783703 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.783676 2536 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73894801-3a68-4b42-ab84-cc9a2706cb9a-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "73894801-3a68-4b42-ab84-cc9a2706cb9a" (UID: "73894801-3a68-4b42-ab84-cc9a2706cb9a"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:48:57.785324 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.785282 2536 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73894801-3a68-4b42-ab84-cc9a2706cb9a-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "73894801-3a68-4b42-ab84-cc9a2706cb9a" (UID: "73894801-3a68-4b42-ab84-cc9a2706cb9a"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:48:57.786014 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.785973 2536 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73894801-3a68-4b42-ab84-cc9a2706cb9a-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "73894801-3a68-4b42-ab84-cc9a2706cb9a" (UID: "73894801-3a68-4b42-ab84-cc9a2706cb9a"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:48:57.787234 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.787207 2536 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73894801-3a68-4b42-ab84-cc9a2706cb9a-config-out" (OuterVolumeSpecName: "config-out") pod "73894801-3a68-4b42-ab84-cc9a2706cb9a" (UID: "73894801-3a68-4b42-ab84-cc9a2706cb9a"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 16:48:57.787374 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.787348 2536 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73894801-3a68-4b42-ab84-cc9a2706cb9a-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "73894801-3a68-4b42-ab84-cc9a2706cb9a" (UID: "73894801-3a68-4b42-ab84-cc9a2706cb9a"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:48:57.787440 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.787372 2536 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73894801-3a68-4b42-ab84-cc9a2706cb9a-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "73894801-3a68-4b42-ab84-cc9a2706cb9a" (UID: "73894801-3a68-4b42-ab84-cc9a2706cb9a"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:48:57.787563 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.787539 2536 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73894801-3a68-4b42-ab84-cc9a2706cb9a-kube-api-access-dbhzk" (OuterVolumeSpecName: "kube-api-access-dbhzk") pod "73894801-3a68-4b42-ab84-cc9a2706cb9a" (UID: "73894801-3a68-4b42-ab84-cc9a2706cb9a"). InnerVolumeSpecName "kube-api-access-dbhzk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:48:57.788020 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.787993 2536 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73894801-3a68-4b42-ab84-cc9a2706cb9a-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "73894801-3a68-4b42-ab84-cc9a2706cb9a" (UID: "73894801-3a68-4b42-ab84-cc9a2706cb9a"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:48:57.790217 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.790186 2536 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73894801-3a68-4b42-ab84-cc9a2706cb9a-config-volume" (OuterVolumeSpecName: "config-volume") pod "73894801-3a68-4b42-ab84-cc9a2706cb9a" (UID: "73894801-3a68-4b42-ab84-cc9a2706cb9a"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:48:57.793450 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.793430 2536 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73894801-3a68-4b42-ab84-cc9a2706cb9a-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "73894801-3a68-4b42-ab84-cc9a2706cb9a" (UID: "73894801-3a68-4b42-ab84-cc9a2706cb9a"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:48:57.793531 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.793449 2536 scope.go:117] "RemoveContainer" containerID="ec3c119c71e3a951218dd6104dfc7b6b1dafde1af741d630d70e0202fab7d2c0" Mar 18 16:48:57.800510 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.800491 2536 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73894801-3a68-4b42-ab84-cc9a2706cb9a-web-config" (OuterVolumeSpecName: "web-config") pod "73894801-3a68-4b42-ab84-cc9a2706cb9a" (UID: "73894801-3a68-4b42-ab84-cc9a2706cb9a"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:48:57.808136 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.808117 2536 scope.go:117] "RemoveContainer" containerID="ad93b6d2d40da90c4ae20f90da6da074b9374902d9a61877da86b11c622fcb11" Mar 18 16:48:57.814591 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.814575 2536 scope.go:117] "RemoveContainer" containerID="e0d393b3c62b27773fb5a9f17a499496d070291526ab6fd6f60f31243e060062" Mar 18 16:48:57.820775 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.820758 2536 scope.go:117] "RemoveContainer" containerID="a609cf0c45a4f1f2535328ca3153ded92b662a8726ba2c71ebfd7cc42306c0a2" Mar 18 16:48:57.827081 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.827059 2536 scope.go:117] "RemoveContainer" containerID="69584e2135ce10ecc7ea3050e8584d2a3c98af82da41b26a136c14c3f1958d42" Mar 18 16:48:57.827402 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:48:57.827383 2536 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69584e2135ce10ecc7ea3050e8584d2a3c98af82da41b26a136c14c3f1958d42\": container with ID starting with 69584e2135ce10ecc7ea3050e8584d2a3c98af82da41b26a136c14c3f1958d42 not found: ID does not exist" containerID="69584e2135ce10ecc7ea3050e8584d2a3c98af82da41b26a136c14c3f1958d42" Mar 18 16:48:57.827479 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.827413 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69584e2135ce10ecc7ea3050e8584d2a3c98af82da41b26a136c14c3f1958d42"} err="failed to get container status \"69584e2135ce10ecc7ea3050e8584d2a3c98af82da41b26a136c14c3f1958d42\": rpc error: code = NotFound desc = could not find container \"69584e2135ce10ecc7ea3050e8584d2a3c98af82da41b26a136c14c3f1958d42\": container with ID starting with 69584e2135ce10ecc7ea3050e8584d2a3c98af82da41b26a136c14c3f1958d42 not found: ID does not exist" Mar 18 16:48:57.827479 ip-10-0-132-224 
kubenswrapper[2536]: I0318 16:48:57.827460 2536 scope.go:117] "RemoveContainer" containerID="b4fd04ce80fcdb4742b98bf591354f00f90c3208ed342aa74d043b8318fdf2b0" Mar 18 16:48:57.827696 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:48:57.827679 2536 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4fd04ce80fcdb4742b98bf591354f00f90c3208ed342aa74d043b8318fdf2b0\": container with ID starting with b4fd04ce80fcdb4742b98bf591354f00f90c3208ed342aa74d043b8318fdf2b0 not found: ID does not exist" containerID="b4fd04ce80fcdb4742b98bf591354f00f90c3208ed342aa74d043b8318fdf2b0" Mar 18 16:48:57.827737 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.827701 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4fd04ce80fcdb4742b98bf591354f00f90c3208ed342aa74d043b8318fdf2b0"} err="failed to get container status \"b4fd04ce80fcdb4742b98bf591354f00f90c3208ed342aa74d043b8318fdf2b0\": rpc error: code = NotFound desc = could not find container \"b4fd04ce80fcdb4742b98bf591354f00f90c3208ed342aa74d043b8318fdf2b0\": container with ID starting with b4fd04ce80fcdb4742b98bf591354f00f90c3208ed342aa74d043b8318fdf2b0 not found: ID does not exist" Mar 18 16:48:57.827737 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.827718 2536 scope.go:117] "RemoveContainer" containerID="0243ad32a23d1feaf823fd9a11c064bb6c6a1b293ebfc100069f848832ec5c55" Mar 18 16:48:57.827921 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:48:57.827905 2536 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0243ad32a23d1feaf823fd9a11c064bb6c6a1b293ebfc100069f848832ec5c55\": container with ID starting with 0243ad32a23d1feaf823fd9a11c064bb6c6a1b293ebfc100069f848832ec5c55 not found: ID does not exist" containerID="0243ad32a23d1feaf823fd9a11c064bb6c6a1b293ebfc100069f848832ec5c55" Mar 18 16:48:57.827959 ip-10-0-132-224 kubenswrapper[2536]: 
I0318 16:48:57.827926 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0243ad32a23d1feaf823fd9a11c064bb6c6a1b293ebfc100069f848832ec5c55"} err="failed to get container status \"0243ad32a23d1feaf823fd9a11c064bb6c6a1b293ebfc100069f848832ec5c55\": rpc error: code = NotFound desc = could not find container \"0243ad32a23d1feaf823fd9a11c064bb6c6a1b293ebfc100069f848832ec5c55\": container with ID starting with 0243ad32a23d1feaf823fd9a11c064bb6c6a1b293ebfc100069f848832ec5c55 not found: ID does not exist" Mar 18 16:48:57.827959 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.827938 2536 scope.go:117] "RemoveContainer" containerID="ec3c119c71e3a951218dd6104dfc7b6b1dafde1af741d630d70e0202fab7d2c0" Mar 18 16:48:57.828119 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:48:57.828103 2536 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec3c119c71e3a951218dd6104dfc7b6b1dafde1af741d630d70e0202fab7d2c0\": container with ID starting with ec3c119c71e3a951218dd6104dfc7b6b1dafde1af741d630d70e0202fab7d2c0 not found: ID does not exist" containerID="ec3c119c71e3a951218dd6104dfc7b6b1dafde1af741d630d70e0202fab7d2c0" Mar 18 16:48:57.828180 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.828129 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec3c119c71e3a951218dd6104dfc7b6b1dafde1af741d630d70e0202fab7d2c0"} err="failed to get container status \"ec3c119c71e3a951218dd6104dfc7b6b1dafde1af741d630d70e0202fab7d2c0\": rpc error: code = NotFound desc = could not find container \"ec3c119c71e3a951218dd6104dfc7b6b1dafde1af741d630d70e0202fab7d2c0\": container with ID starting with ec3c119c71e3a951218dd6104dfc7b6b1dafde1af741d630d70e0202fab7d2c0 not found: ID does not exist" Mar 18 16:48:57.828180 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.828149 2536 scope.go:117] "RemoveContainer" 
containerID="ad93b6d2d40da90c4ae20f90da6da074b9374902d9a61877da86b11c622fcb11" Mar 18 16:48:57.828371 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:48:57.828355 2536 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad93b6d2d40da90c4ae20f90da6da074b9374902d9a61877da86b11c622fcb11\": container with ID starting with ad93b6d2d40da90c4ae20f90da6da074b9374902d9a61877da86b11c622fcb11 not found: ID does not exist" containerID="ad93b6d2d40da90c4ae20f90da6da074b9374902d9a61877da86b11c622fcb11" Mar 18 16:48:57.828415 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.828374 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad93b6d2d40da90c4ae20f90da6da074b9374902d9a61877da86b11c622fcb11"} err="failed to get container status \"ad93b6d2d40da90c4ae20f90da6da074b9374902d9a61877da86b11c622fcb11\": rpc error: code = NotFound desc = could not find container \"ad93b6d2d40da90c4ae20f90da6da074b9374902d9a61877da86b11c622fcb11\": container with ID starting with ad93b6d2d40da90c4ae20f90da6da074b9374902d9a61877da86b11c622fcb11 not found: ID does not exist" Mar 18 16:48:57.828415 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.828386 2536 scope.go:117] "RemoveContainer" containerID="e0d393b3c62b27773fb5a9f17a499496d070291526ab6fd6f60f31243e060062" Mar 18 16:48:57.828578 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:48:57.828564 2536 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0d393b3c62b27773fb5a9f17a499496d070291526ab6fd6f60f31243e060062\": container with ID starting with e0d393b3c62b27773fb5a9f17a499496d070291526ab6fd6f60f31243e060062 not found: ID does not exist" containerID="e0d393b3c62b27773fb5a9f17a499496d070291526ab6fd6f60f31243e060062" Mar 18 16:48:57.828619 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.828582 2536 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"e0d393b3c62b27773fb5a9f17a499496d070291526ab6fd6f60f31243e060062"} err="failed to get container status \"e0d393b3c62b27773fb5a9f17a499496d070291526ab6fd6f60f31243e060062\": rpc error: code = NotFound desc = could not find container \"e0d393b3c62b27773fb5a9f17a499496d070291526ab6fd6f60f31243e060062\": container with ID starting with e0d393b3c62b27773fb5a9f17a499496d070291526ab6fd6f60f31243e060062 not found: ID does not exist" Mar 18 16:48:57.828619 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.828593 2536 scope.go:117] "RemoveContainer" containerID="a609cf0c45a4f1f2535328ca3153ded92b662a8726ba2c71ebfd7cc42306c0a2" Mar 18 16:48:57.828761 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:48:57.828747 2536 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a609cf0c45a4f1f2535328ca3153ded92b662a8726ba2c71ebfd7cc42306c0a2\": container with ID starting with a609cf0c45a4f1f2535328ca3153ded92b662a8726ba2c71ebfd7cc42306c0a2 not found: ID does not exist" containerID="a609cf0c45a4f1f2535328ca3153ded92b662a8726ba2c71ebfd7cc42306c0a2" Mar 18 16:48:57.828801 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.828763 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a609cf0c45a4f1f2535328ca3153ded92b662a8726ba2c71ebfd7cc42306c0a2"} err="failed to get container status \"a609cf0c45a4f1f2535328ca3153ded92b662a8726ba2c71ebfd7cc42306c0a2\": rpc error: code = NotFound desc = could not find container \"a609cf0c45a4f1f2535328ca3153ded92b662a8726ba2c71ebfd7cc42306c0a2\": container with ID starting with a609cf0c45a4f1f2535328ca3153ded92b662a8726ba2c71ebfd7cc42306c0a2 not found: ID does not exist" Mar 18 16:48:57.828801 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.828774 2536 scope.go:117] "RemoveContainer" containerID="69584e2135ce10ecc7ea3050e8584d2a3c98af82da41b26a136c14c3f1958d42" Mar 18 16:48:57.828988 
ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.828971 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69584e2135ce10ecc7ea3050e8584d2a3c98af82da41b26a136c14c3f1958d42"} err="failed to get container status \"69584e2135ce10ecc7ea3050e8584d2a3c98af82da41b26a136c14c3f1958d42\": rpc error: code = NotFound desc = could not find container \"69584e2135ce10ecc7ea3050e8584d2a3c98af82da41b26a136c14c3f1958d42\": container with ID starting with 69584e2135ce10ecc7ea3050e8584d2a3c98af82da41b26a136c14c3f1958d42 not found: ID does not exist" Mar 18 16:48:57.828988 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.828987 2536 scope.go:117] "RemoveContainer" containerID="b4fd04ce80fcdb4742b98bf591354f00f90c3208ed342aa74d043b8318fdf2b0" Mar 18 16:48:57.829172 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.829154 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4fd04ce80fcdb4742b98bf591354f00f90c3208ed342aa74d043b8318fdf2b0"} err="failed to get container status \"b4fd04ce80fcdb4742b98bf591354f00f90c3208ed342aa74d043b8318fdf2b0\": rpc error: code = NotFound desc = could not find container \"b4fd04ce80fcdb4742b98bf591354f00f90c3208ed342aa74d043b8318fdf2b0\": container with ID starting with b4fd04ce80fcdb4742b98bf591354f00f90c3208ed342aa74d043b8318fdf2b0 not found: ID does not exist" Mar 18 16:48:57.829216 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.829173 2536 scope.go:117] "RemoveContainer" containerID="0243ad32a23d1feaf823fd9a11c064bb6c6a1b293ebfc100069f848832ec5c55" Mar 18 16:48:57.829409 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.829391 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0243ad32a23d1feaf823fd9a11c064bb6c6a1b293ebfc100069f848832ec5c55"} err="failed to get container status \"0243ad32a23d1feaf823fd9a11c064bb6c6a1b293ebfc100069f848832ec5c55\": rpc error: code = NotFound desc = could not 
find container \"0243ad32a23d1feaf823fd9a11c064bb6c6a1b293ebfc100069f848832ec5c55\": container with ID starting with 0243ad32a23d1feaf823fd9a11c064bb6c6a1b293ebfc100069f848832ec5c55 not found: ID does not exist" Mar 18 16:48:57.829475 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.829409 2536 scope.go:117] "RemoveContainer" containerID="ec3c119c71e3a951218dd6104dfc7b6b1dafde1af741d630d70e0202fab7d2c0" Mar 18 16:48:57.829609 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.829590 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec3c119c71e3a951218dd6104dfc7b6b1dafde1af741d630d70e0202fab7d2c0"} err="failed to get container status \"ec3c119c71e3a951218dd6104dfc7b6b1dafde1af741d630d70e0202fab7d2c0\": rpc error: code = NotFound desc = could not find container \"ec3c119c71e3a951218dd6104dfc7b6b1dafde1af741d630d70e0202fab7d2c0\": container with ID starting with ec3c119c71e3a951218dd6104dfc7b6b1dafde1af741d630d70e0202fab7d2c0 not found: ID does not exist" Mar 18 16:48:57.829646 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.829611 2536 scope.go:117] "RemoveContainer" containerID="ad93b6d2d40da90c4ae20f90da6da074b9374902d9a61877da86b11c622fcb11" Mar 18 16:48:57.829847 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.829828 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad93b6d2d40da90c4ae20f90da6da074b9374902d9a61877da86b11c622fcb11"} err="failed to get container status \"ad93b6d2d40da90c4ae20f90da6da074b9374902d9a61877da86b11c622fcb11\": rpc error: code = NotFound desc = could not find container \"ad93b6d2d40da90c4ae20f90da6da074b9374902d9a61877da86b11c622fcb11\": container with ID starting with ad93b6d2d40da90c4ae20f90da6da074b9374902d9a61877da86b11c622fcb11 not found: ID does not exist" Mar 18 16:48:57.829847 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.829848 2536 scope.go:117] "RemoveContainer" 
containerID="e0d393b3c62b27773fb5a9f17a499496d070291526ab6fd6f60f31243e060062" Mar 18 16:48:57.830045 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.830030 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0d393b3c62b27773fb5a9f17a499496d070291526ab6fd6f60f31243e060062"} err="failed to get container status \"e0d393b3c62b27773fb5a9f17a499496d070291526ab6fd6f60f31243e060062\": rpc error: code = NotFound desc = could not find container \"e0d393b3c62b27773fb5a9f17a499496d070291526ab6fd6f60f31243e060062\": container with ID starting with e0d393b3c62b27773fb5a9f17a499496d070291526ab6fd6f60f31243e060062 not found: ID does not exist" Mar 18 16:48:57.830045 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.830043 2536 scope.go:117] "RemoveContainer" containerID="a609cf0c45a4f1f2535328ca3153ded92b662a8726ba2c71ebfd7cc42306c0a2" Mar 18 16:48:57.830231 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.830215 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a609cf0c45a4f1f2535328ca3153ded92b662a8726ba2c71ebfd7cc42306c0a2"} err="failed to get container status \"a609cf0c45a4f1f2535328ca3153ded92b662a8726ba2c71ebfd7cc42306c0a2\": rpc error: code = NotFound desc = could not find container \"a609cf0c45a4f1f2535328ca3153ded92b662a8726ba2c71ebfd7cc42306c0a2\": container with ID starting with a609cf0c45a4f1f2535328ca3153ded92b662a8726ba2c71ebfd7cc42306c0a2 not found: ID does not exist" Mar 18 16:48:57.884130 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.884109 2536 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/73894801-3a68-4b42-ab84-cc9a2706cb9a-secret-alertmanager-main-tls\") on node \"ip-10-0-132-224.ec2.internal\" DevicePath \"\"" Mar 18 16:48:57.884130 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.884129 2536 reconciler_common.go:299] "Volume detached for volume 
\"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/73894801-3a68-4b42-ab84-cc9a2706cb9a-metrics-client-ca\") on node \"ip-10-0-132-224.ec2.internal\" DevicePath \"\"" Mar 18 16:48:57.884262 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.884141 2536 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/73894801-3a68-4b42-ab84-cc9a2706cb9a-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-132-224.ec2.internal\" DevicePath \"\"" Mar 18 16:48:57.884262 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.884154 2536 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73894801-3a68-4b42-ab84-cc9a2706cb9a-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-132-224.ec2.internal\" DevicePath \"\"" Mar 18 16:48:57.884262 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.884167 2536 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/73894801-3a68-4b42-ab84-cc9a2706cb9a-web-config\") on node \"ip-10-0-132-224.ec2.internal\" DevicePath \"\"" Mar 18 16:48:57.884262 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.884176 2536 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/73894801-3a68-4b42-ab84-cc9a2706cb9a-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-132-224.ec2.internal\" DevicePath \"\"" Mar 18 16:48:57.884262 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.884185 2536 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/73894801-3a68-4b42-ab84-cc9a2706cb9a-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-132-224.ec2.internal\" DevicePath \"\"" Mar 18 16:48:57.884262 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.884194 2536 
reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/73894801-3a68-4b42-ab84-cc9a2706cb9a-alertmanager-main-db\") on node \"ip-10-0-132-224.ec2.internal\" DevicePath \"\"" Mar 18 16:48:57.884262 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.884209 2536 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/73894801-3a68-4b42-ab84-cc9a2706cb9a-tls-assets\") on node \"ip-10-0-132-224.ec2.internal\" DevicePath \"\"" Mar 18 16:48:57.884262 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.884219 2536 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/73894801-3a68-4b42-ab84-cc9a2706cb9a-config-out\") on node \"ip-10-0-132-224.ec2.internal\" DevicePath \"\"" Mar 18 16:48:57.884262 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.884227 2536 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/73894801-3a68-4b42-ab84-cc9a2706cb9a-config-volume\") on node \"ip-10-0-132-224.ec2.internal\" DevicePath \"\"" Mar 18 16:48:57.884262 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.884240 2536 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/73894801-3a68-4b42-ab84-cc9a2706cb9a-cluster-tls-config\") on node \"ip-10-0-132-224.ec2.internal\" DevicePath \"\"" Mar 18 16:48:57.884262 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:57.884252 2536 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dbhzk\" (UniqueName: \"kubernetes.io/projected/73894801-3a68-4b42-ab84-cc9a2706cb9a-kube-api-access-dbhzk\") on node \"ip-10-0-132-224.ec2.internal\" DevicePath \"\"" Mar 18 16:48:58.082583 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.082516 2536 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 18 
16:48:58.086022 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.085995 2536 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 18 16:48:58.117335 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.117301 2536 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 18 16:48:58.117664 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.117646 2536 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73894801-3a68-4b42-ab84-cc9a2706cb9a" containerName="kube-rbac-proxy-metric" Mar 18 16:48:58.117745 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.117665 2536 state_mem.go:107] "Deleted CPUSet assignment" podUID="73894801-3a68-4b42-ab84-cc9a2706cb9a" containerName="kube-rbac-proxy-metric" Mar 18 16:48:58.117745 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.117683 2536 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73894801-3a68-4b42-ab84-cc9a2706cb9a" containerName="config-reloader" Mar 18 16:48:58.117745 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.117692 2536 state_mem.go:107] "Deleted CPUSet assignment" podUID="73894801-3a68-4b42-ab84-cc9a2706cb9a" containerName="config-reloader" Mar 18 16:48:58.117745 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.117702 2536 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73894801-3a68-4b42-ab84-cc9a2706cb9a" containerName="prom-label-proxy" Mar 18 16:48:58.117745 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.117711 2536 state_mem.go:107] "Deleted CPUSet assignment" podUID="73894801-3a68-4b42-ab84-cc9a2706cb9a" containerName="prom-label-proxy" Mar 18 16:48:58.117745 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.117722 2536 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73894801-3a68-4b42-ab84-cc9a2706cb9a" containerName="kube-rbac-proxy-web" Mar 18 16:48:58.117745 
ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.117730 2536 state_mem.go:107] "Deleted CPUSet assignment" podUID="73894801-3a68-4b42-ab84-cc9a2706cb9a" containerName="kube-rbac-proxy-web" Mar 18 16:48:58.117745 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.117746 2536 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73894801-3a68-4b42-ab84-cc9a2706cb9a" containerName="alertmanager" Mar 18 16:48:58.118123 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.117754 2536 state_mem.go:107] "Deleted CPUSet assignment" podUID="73894801-3a68-4b42-ab84-cc9a2706cb9a" containerName="alertmanager" Mar 18 16:48:58.118123 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.117762 2536 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73894801-3a68-4b42-ab84-cc9a2706cb9a" containerName="init-config-reloader" Mar 18 16:48:58.118123 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.117771 2536 state_mem.go:107] "Deleted CPUSet assignment" podUID="73894801-3a68-4b42-ab84-cc9a2706cb9a" containerName="init-config-reloader" Mar 18 16:48:58.118123 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.117781 2536 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73894801-3a68-4b42-ab84-cc9a2706cb9a" containerName="kube-rbac-proxy" Mar 18 16:48:58.118123 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.117806 2536 state_mem.go:107] "Deleted CPUSet assignment" podUID="73894801-3a68-4b42-ab84-cc9a2706cb9a" containerName="kube-rbac-proxy" Mar 18 16:48:58.118123 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.117880 2536 memory_manager.go:356] "RemoveStaleState removing state" podUID="73894801-3a68-4b42-ab84-cc9a2706cb9a" containerName="alertmanager" Mar 18 16:48:58.118123 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.117895 2536 memory_manager.go:356] "RemoveStaleState removing state" podUID="73894801-3a68-4b42-ab84-cc9a2706cb9a" containerName="kube-rbac-proxy-metric" Mar 18 
16:48:58.118123 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.117907 2536 memory_manager.go:356] "RemoveStaleState removing state" podUID="73894801-3a68-4b42-ab84-cc9a2706cb9a" containerName="kube-rbac-proxy" Mar 18 16:48:58.118123 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.117916 2536 memory_manager.go:356] "RemoveStaleState removing state" podUID="73894801-3a68-4b42-ab84-cc9a2706cb9a" containerName="kube-rbac-proxy-web" Mar 18 16:48:58.118123 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.117925 2536 memory_manager.go:356] "RemoveStaleState removing state" podUID="73894801-3a68-4b42-ab84-cc9a2706cb9a" containerName="config-reloader" Mar 18 16:48:58.118123 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.117935 2536 memory_manager.go:356] "RemoveStaleState removing state" podUID="73894801-3a68-4b42-ab84-cc9a2706cb9a" containerName="prom-label-proxy" Mar 18 16:48:58.126705 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.126684 2536 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:58.128883 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.128858 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Mar 18 16:48:58.128883 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.128880 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Mar 18 16:48:58.129083 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.128868 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Mar 18 16:48:58.129083 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.128858 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Mar 18 16:48:58.129083 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.129002 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Mar 18 16:48:58.129083 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.129035 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-bdkn5\"" Mar 18 16:48:58.129308 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.129191 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Mar 18 16:48:58.129308 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.129233 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Mar 18 16:48:58.129427 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.129337 2536 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Mar 18 16:48:58.135023 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.135001 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 18 16:48:58.135132 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.135023 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Mar 18 16:48:58.186930 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.186897 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a130b1b3-5902-463d-ac4c-4f7aaa59cb2f-config-out\") pod \"alertmanager-main-0\" (UID: \"a130b1b3-5902-463d-ac4c-4f7aaa59cb2f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:58.187072 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.186933 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/a130b1b3-5902-463d-ac4c-4f7aaa59cb2f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"a130b1b3-5902-463d-ac4c-4f7aaa59cb2f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:58.187072 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.186969 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a130b1b3-5902-463d-ac4c-4f7aaa59cb2f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"a130b1b3-5902-463d-ac4c-4f7aaa59cb2f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:58.187072 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.186998 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: 
\"kubernetes.io/empty-dir/a130b1b3-5902-463d-ac4c-4f7aaa59cb2f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"a130b1b3-5902-463d-ac4c-4f7aaa59cb2f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:58.187072 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.187017 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a130b1b3-5902-463d-ac4c-4f7aaa59cb2f-web-config\") pod \"alertmanager-main-0\" (UID: \"a130b1b3-5902-463d-ac4c-4f7aaa59cb2f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:58.187072 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.187059 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a130b1b3-5902-463d-ac4c-4f7aaa59cb2f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"a130b1b3-5902-463d-ac4c-4f7aaa59cb2f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:58.187233 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.187078 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/a130b1b3-5902-463d-ac4c-4f7aaa59cb2f-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"a130b1b3-5902-463d-ac4c-4f7aaa59cb2f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:58.187233 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.187104 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a130b1b3-5902-463d-ac4c-4f7aaa59cb2f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"a130b1b3-5902-463d-ac4c-4f7aaa59cb2f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:58.187233 
ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.187145 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/a130b1b3-5902-463d-ac4c-4f7aaa59cb2f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"a130b1b3-5902-463d-ac4c-4f7aaa59cb2f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:58.187233 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.187163 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpxnj\" (UniqueName: \"kubernetes.io/projected/a130b1b3-5902-463d-ac4c-4f7aaa59cb2f-kube-api-access-lpxnj\") pod \"alertmanager-main-0\" (UID: \"a130b1b3-5902-463d-ac4c-4f7aaa59cb2f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:58.187233 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.187182 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a130b1b3-5902-463d-ac4c-4f7aaa59cb2f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"a130b1b3-5902-463d-ac4c-4f7aaa59cb2f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:58.187233 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.187220 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a130b1b3-5902-463d-ac4c-4f7aaa59cb2f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"a130b1b3-5902-463d-ac4c-4f7aaa59cb2f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:58.187442 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.187248 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a130b1b3-5902-463d-ac4c-4f7aaa59cb2f-config-volume\") pod \"alertmanager-main-0\" (UID: \"a130b1b3-5902-463d-ac4c-4f7aaa59cb2f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:58.288478 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.288450 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a130b1b3-5902-463d-ac4c-4f7aaa59cb2f-config-out\") pod \"alertmanager-main-0\" (UID: \"a130b1b3-5902-463d-ac4c-4f7aaa59cb2f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:58.288631 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.288483 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/a130b1b3-5902-463d-ac4c-4f7aaa59cb2f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"a130b1b3-5902-463d-ac4c-4f7aaa59cb2f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:58.288631 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.288509 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a130b1b3-5902-463d-ac4c-4f7aaa59cb2f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"a130b1b3-5902-463d-ac4c-4f7aaa59cb2f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:58.288631 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.288530 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/a130b1b3-5902-463d-ac4c-4f7aaa59cb2f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"a130b1b3-5902-463d-ac4c-4f7aaa59cb2f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:58.288631 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.288548 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a130b1b3-5902-463d-ac4c-4f7aaa59cb2f-web-config\") pod \"alertmanager-main-0\" (UID: \"a130b1b3-5902-463d-ac4c-4f7aaa59cb2f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:58.288631 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.288567 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a130b1b3-5902-463d-ac4c-4f7aaa59cb2f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"a130b1b3-5902-463d-ac4c-4f7aaa59cb2f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:58.288631 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.288617 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/a130b1b3-5902-463d-ac4c-4f7aaa59cb2f-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"a130b1b3-5902-463d-ac4c-4f7aaa59cb2f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:58.288933 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.288665 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a130b1b3-5902-463d-ac4c-4f7aaa59cb2f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"a130b1b3-5902-463d-ac4c-4f7aaa59cb2f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:58.288933 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.288699 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/a130b1b3-5902-463d-ac4c-4f7aaa59cb2f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"a130b1b3-5902-463d-ac4c-4f7aaa59cb2f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:58.288933 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.288724 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lpxnj\" (UniqueName: \"kubernetes.io/projected/a130b1b3-5902-463d-ac4c-4f7aaa59cb2f-kube-api-access-lpxnj\") pod \"alertmanager-main-0\" (UID: \"a130b1b3-5902-463d-ac4c-4f7aaa59cb2f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:58.288933 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.288755 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a130b1b3-5902-463d-ac4c-4f7aaa59cb2f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"a130b1b3-5902-463d-ac4c-4f7aaa59cb2f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:58.288933 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.288793 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a130b1b3-5902-463d-ac4c-4f7aaa59cb2f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"a130b1b3-5902-463d-ac4c-4f7aaa59cb2f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:58.288933 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.288844 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a130b1b3-5902-463d-ac4c-4f7aaa59cb2f-config-volume\") pod \"alertmanager-main-0\" (UID: \"a130b1b3-5902-463d-ac4c-4f7aaa59cb2f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:58.289231 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.289003 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/a130b1b3-5902-463d-ac4c-4f7aaa59cb2f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"a130b1b3-5902-463d-ac4c-4f7aaa59cb2f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:58.289436 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.289413 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a130b1b3-5902-463d-ac4c-4f7aaa59cb2f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"a130b1b3-5902-463d-ac4c-4f7aaa59cb2f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:58.290059 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.290027 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a130b1b3-5902-463d-ac4c-4f7aaa59cb2f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"a130b1b3-5902-463d-ac4c-4f7aaa59cb2f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:58.291812 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.291694 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a130b1b3-5902-463d-ac4c-4f7aaa59cb2f-config-out\") pod \"alertmanager-main-0\" (UID: \"a130b1b3-5902-463d-ac4c-4f7aaa59cb2f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:58.291812 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.291730 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a130b1b3-5902-463d-ac4c-4f7aaa59cb2f-web-config\") pod \"alertmanager-main-0\" (UID: \"a130b1b3-5902-463d-ac4c-4f7aaa59cb2f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:58.291812 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.291730 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a130b1b3-5902-463d-ac4c-4f7aaa59cb2f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"a130b1b3-5902-463d-ac4c-4f7aaa59cb2f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:58.291812 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.291736 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/a130b1b3-5902-463d-ac4c-4f7aaa59cb2f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"a130b1b3-5902-463d-ac4c-4f7aaa59cb2f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:58.292117 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.292097 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a130b1b3-5902-463d-ac4c-4f7aaa59cb2f-config-volume\") pod \"alertmanager-main-0\" (UID: \"a130b1b3-5902-463d-ac4c-4f7aaa59cb2f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:58.292221 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.292194 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/a130b1b3-5902-463d-ac4c-4f7aaa59cb2f-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"a130b1b3-5902-463d-ac4c-4f7aaa59cb2f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:58.292344 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.292326 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/a130b1b3-5902-463d-ac4c-4f7aaa59cb2f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"a130b1b3-5902-463d-ac4c-4f7aaa59cb2f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:58.292601 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.292570 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a130b1b3-5902-463d-ac4c-4f7aaa59cb2f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"a130b1b3-5902-463d-ac4c-4f7aaa59cb2f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:58.293175 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.293157 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a130b1b3-5902-463d-ac4c-4f7aaa59cb2f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"a130b1b3-5902-463d-ac4c-4f7aaa59cb2f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:58.297004 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.296988 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpxnj\" (UniqueName: \"kubernetes.io/projected/a130b1b3-5902-463d-ac4c-4f7aaa59cb2f-kube-api-access-lpxnj\") pod \"alertmanager-main-0\" (UID: \"a130b1b3-5902-463d-ac4c-4f7aaa59cb2f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:58.438679 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.438578 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:48:58.564344 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.564318 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Mar 18 16:48:58.566681 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:48:58.566652 2536 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda130b1b3_5902_463d_ac4c_4f7aaa59cb2f.slice/crio-d5b239ab74eadc0da947d25e46f911e415d8b34dda3be6a3e60935ee5f59b661 WatchSource:0}: Error finding container d5b239ab74eadc0da947d25e46f911e415d8b34dda3be6a3e60935ee5f59b661: Status 404 returned error can't find the container with id d5b239ab74eadc0da947d25e46f911e415d8b34dda3be6a3e60935ee5f59b661
Mar 18 16:48:58.769312 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.769286 2536 generic.go:358] "Generic (PLEG): container finished" podID="a130b1b3-5902-463d-ac4c-4f7aaa59cb2f" containerID="db4ad5abc00af7578ea523cedd8d9b943e80139b68d766bd34f159b2103cae3d" exitCode=0
Mar 18 16:48:58.769639 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.769367 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a130b1b3-5902-463d-ac4c-4f7aaa59cb2f","Type":"ContainerDied","Data":"db4ad5abc00af7578ea523cedd8d9b943e80139b68d766bd34f159b2103cae3d"}
Mar 18 16:48:58.769639 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:58.769411 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a130b1b3-5902-463d-ac4c-4f7aaa59cb2f","Type":"ContainerStarted","Data":"d5b239ab74eadc0da947d25e46f911e415d8b34dda3be6a3e60935ee5f59b661"}
Mar 18 16:48:59.775853 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:59.775822 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a130b1b3-5902-463d-ac4c-4f7aaa59cb2f","Type":"ContainerStarted","Data":"e050a780173b19cf2d3ffcac49e63256979611ed4491e932c48c06b02f350246"}
Mar 18 16:48:59.775853 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:59.775856 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a130b1b3-5902-463d-ac4c-4f7aaa59cb2f","Type":"ContainerStarted","Data":"7d9ca5422b7c4fb66ad3645f2712d4f96b909ff1ee6918c1b01dc0660a5770d4"}
Mar 18 16:48:59.775853 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:59.775866 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a130b1b3-5902-463d-ac4c-4f7aaa59cb2f","Type":"ContainerStarted","Data":"324947c1c69c09b5914a63322a007a5be86fdc451652b37d891199a522ccad17"}
Mar 18 16:48:59.776333 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:59.775878 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a130b1b3-5902-463d-ac4c-4f7aaa59cb2f","Type":"ContainerStarted","Data":"848e3d721dfdd9a161519133fffdbc0c84061315fa80be1864fc6d8e5fe95525"}
Mar 18 16:48:59.776333 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:59.775887 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a130b1b3-5902-463d-ac4c-4f7aaa59cb2f","Type":"ContainerStarted","Data":"8f3039307c641e723ea3f66f27a1dc9cbc03942aff239735a9a0b59c65343d48"}
Mar 18 16:48:59.776333 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:59.775894 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a130b1b3-5902-463d-ac4c-4f7aaa59cb2f","Type":"ContainerStarted","Data":"837551efd804c1ccdc41c59e2e2454759347477ce33c1b9f9e05609883258bad"}
Mar 18 16:48:59.802256 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:59.802204 2536 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.8021891399999999 podStartE2EDuration="1.80218914s" podCreationTimestamp="2026-03-18 16:48:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:48:59.800150775 +0000 UTC m=+262.573725362" watchObservedRunningTime="2026-03-18 16:48:59.80218914 +0000 UTC m=+262.575763726"
Mar 18 16:48:59.909922 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:48:59.909890 2536 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73894801-3a68-4b42-ab84-cc9a2706cb9a" path="/var/lib/kubelet/pods/73894801-3a68-4b42-ab84-cc9a2706cb9a/volumes"
Mar 18 16:49:00.824023 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:00.823994 2536 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Mar 18 16:49:00.824476 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:00.824443 2536 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="ec496183-6a46-48d9-9cd3-79969e0f09a5" containerName="prometheus" containerID="cri-o://e88e3863249bf414fa9d4d3a62ab8c792d4da149b3408350453212f59e45b004" gracePeriod=600
Mar 18 16:49:00.824633 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:00.824481 2536 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="ec496183-6a46-48d9-9cd3-79969e0f09a5" containerName="kube-rbac-proxy" containerID="cri-o://68ba1d01581acec4ce60fdfef6c83a79c8cc12070e86c7b1dff3fbf9f9699fcc" gracePeriod=600
Mar 18 16:49:00.824633 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:00.824494 2536 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="ec496183-6a46-48d9-9cd3-79969e0f09a5" containerName="thanos-sidecar" containerID="cri-o://0866c607fbe8d1fb480a9dbe3a0aca31fe956a926a4ccfe67210cafe8188b60a" gracePeriod=600
Mar 18 16:49:00.824633 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:00.824504 2536 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="ec496183-6a46-48d9-9cd3-79969e0f09a5" containerName="config-reloader" containerID="cri-o://bc8804455305327453ae7a4611ccbfac4205b6416e35786217b763457495f68d" gracePeriod=600
Mar 18 16:49:00.824633 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:00.824575 2536 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="ec496183-6a46-48d9-9cd3-79969e0f09a5" containerName="kube-rbac-proxy-web" containerID="cri-o://66e77476d47e8a3093be7bdaf14b45a2cfcbb87c85889307c75534c9f312f23f" gracePeriod=600
Mar 18 16:49:00.824633 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:00.824589 2536 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="ec496183-6a46-48d9-9cd3-79969e0f09a5" containerName="kube-rbac-proxy-thanos" containerID="cri-o://84cbac1d887142ae76b0efe704614130870d0bfb7cfb3979367e24203a6cbe97" gracePeriod=600
Mar 18 16:49:01.065710 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.065689 2536 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:49:01.211478 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.211445 2536 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ec496183-6a46-48d9-9cd3-79969e0f09a5-configmap-metrics-client-ca\") pod \"ec496183-6a46-48d9-9cd3-79969e0f09a5\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") "
Mar 18 16:49:01.211643 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.211492 2536 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-secret-grpc-tls\") pod \"ec496183-6a46-48d9-9cd3-79969e0f09a5\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") "
Mar 18 16:49:01.211643 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.211519 2536 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec496183-6a46-48d9-9cd3-79969e0f09a5-configmap-serving-certs-ca-bundle\") pod \"ec496183-6a46-48d9-9cd3-79969e0f09a5\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") "
Mar 18 16:49:01.211643 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.211542 2536 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ec496183-6a46-48d9-9cd3-79969e0f09a5-tls-assets\") pod \"ec496183-6a46-48d9-9cd3-79969e0f09a5\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") "
Mar 18 16:49:01.211643 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.211566 2536 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec496183-6a46-48d9-9cd3-79969e0f09a5-configmap-kubelet-serving-ca-bundle\") pod \"ec496183-6a46-48d9-9cd3-79969e0f09a5\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") "
Mar 18 16:49:01.211643 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.211592 2536 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-secret-metrics-client-certs\") pod \"ec496183-6a46-48d9-9cd3-79969e0f09a5\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") "
Mar 18 16:49:01.211643 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.211614 2536 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"ec496183-6a46-48d9-9cd3-79969e0f09a5\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") "
Mar 18 16:49:01.211643 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.211637 2536 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-web-config\") pod \"ec496183-6a46-48d9-9cd3-79969e0f09a5\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") "
Mar 18 16:49:01.211973 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.211691 2536 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-thanos-prometheus-http-client-file\") pod \"ec496183-6a46-48d9-9cd3-79969e0f09a5\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") "
Mar 18 16:49:01.211973 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.211724 2536 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ec496183-6a46-48d9-9cd3-79969e0f09a5-prometheus-k8s-rulefiles-0\") pod \"ec496183-6a46-48d9-9cd3-79969e0f09a5\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") "
Mar 18 16:49:01.211973 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.211748 2536 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-config\") pod \"ec496183-6a46-48d9-9cd3-79969e0f09a5\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") "
Mar 18 16:49:01.211973 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.211785 2536 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ec496183-6a46-48d9-9cd3-79969e0f09a5-config-out\") pod \"ec496183-6a46-48d9-9cd3-79969e0f09a5\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") "
Mar 18 16:49:01.211973 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.211825 2536 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-secret-kube-rbac-proxy\") pod \"ec496183-6a46-48d9-9cd3-79969e0f09a5\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") "
Mar 18 16:49:01.211973 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.211859 2536 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"ec496183-6a46-48d9-9cd3-79969e0f09a5\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") "
Mar 18 16:49:01.211973 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.211893 2536 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec496183-6a46-48d9-9cd3-79969e0f09a5-prometheus-trusted-ca-bundle\") pod \"ec496183-6a46-48d9-9cd3-79969e0f09a5\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") "
Mar 18 16:49:01.211973 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.211917 2536 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghwcx\" (UniqueName: \"kubernetes.io/projected/ec496183-6a46-48d9-9cd3-79969e0f09a5-kube-api-access-ghwcx\") pod \"ec496183-6a46-48d9-9cd3-79969e0f09a5\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") "
Mar 18 16:49:01.211973 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.211950 2536 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ec496183-6a46-48d9-9cd3-79969e0f09a5-prometheus-k8s-db\") pod \"ec496183-6a46-48d9-9cd3-79969e0f09a5\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") "
Mar 18 16:49:01.212419 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.211993 2536 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec496183-6a46-48d9-9cd3-79969e0f09a5-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "ec496183-6a46-48d9-9cd3-79969e0f09a5" (UID: "ec496183-6a46-48d9-9cd3-79969e0f09a5"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 18 16:49:01.212419 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.212021 2536 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-secret-prometheus-k8s-tls\") pod \"ec496183-6a46-48d9-9cd3-79969e0f09a5\" (UID: \"ec496183-6a46-48d9-9cd3-79969e0f09a5\") "
Mar 18 16:49:01.212419 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.212122 2536 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec496183-6a46-48d9-9cd3-79969e0f09a5-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "ec496183-6a46-48d9-9cd3-79969e0f09a5" (UID: "ec496183-6a46-48d9-9cd3-79969e0f09a5"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 18 16:49:01.212419 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.212200 2536 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec496183-6a46-48d9-9cd3-79969e0f09a5-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "ec496183-6a46-48d9-9cd3-79969e0f09a5" (UID: "ec496183-6a46-48d9-9cd3-79969e0f09a5"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 18 16:49:01.212419 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.212334 2536 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ec496183-6a46-48d9-9cd3-79969e0f09a5-configmap-metrics-client-ca\") on node \"ip-10-0-132-224.ec2.internal\" DevicePath \"\""
Mar 18 16:49:01.212419 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.212357 2536 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec496183-6a46-48d9-9cd3-79969e0f09a5-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-132-224.ec2.internal\" DevicePath \"\""
Mar 18 16:49:01.212419 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.212375 2536 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec496183-6a46-48d9-9cd3-79969e0f09a5-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-132-224.ec2.internal\" DevicePath \"\""
Mar 18 16:49:01.214776 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.213807 2536 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec496183-6a46-48d9-9cd3-79969e0f09a5-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "ec496183-6a46-48d9-9cd3-79969e0f09a5" (UID: "ec496183-6a46-48d9-9cd3-79969e0f09a5"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 18 16:49:01.214776 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.213940 2536 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec496183-6a46-48d9-9cd3-79969e0f09a5-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "ec496183-6a46-48d9-9cd3-79969e0f09a5" (UID: "ec496183-6a46-48d9-9cd3-79969e0f09a5"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 18 16:49:01.214776 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.214734 2536 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec496183-6a46-48d9-9cd3-79969e0f09a5-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "ec496183-6a46-48d9-9cd3-79969e0f09a5" (UID: "ec496183-6a46-48d9-9cd3-79969e0f09a5"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 18 16:49:01.215539 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.215511 2536 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "ec496183-6a46-48d9-9cd3-79969e0f09a5" (UID: "ec496183-6a46-48d9-9cd3-79969e0f09a5"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 18 16:49:01.215747 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.215716 2536 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "ec496183-6a46-48d9-9cd3-79969e0f09a5" (UID: "ec496183-6a46-48d9-9cd3-79969e0f09a5"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 18 16:49:01.215847 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.215826 2536 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "ec496183-6a46-48d9-9cd3-79969e0f09a5" (UID: "ec496183-6a46-48d9-9cd3-79969e0f09a5"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 18 16:49:01.215926 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.215907 2536 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "ec496183-6a46-48d9-9cd3-79969e0f09a5" (UID: "ec496183-6a46-48d9-9cd3-79969e0f09a5"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 18 16:49:01.216019 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.215936 2536 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "ec496183-6a46-48d9-9cd3-79969e0f09a5" (UID: "ec496183-6a46-48d9-9cd3-79969e0f09a5"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 18 16:49:01.216097 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.216031 2536 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec496183-6a46-48d9-9cd3-79969e0f09a5-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "ec496183-6a46-48d9-9cd3-79969e0f09a5" (UID: "ec496183-6a46-48d9-9cd3-79969e0f09a5"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 18 16:49:01.216097 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.216059 2536 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec496183-6a46-48d9-9cd3-79969e0f09a5-kube-api-access-ghwcx" (OuterVolumeSpecName: "kube-api-access-ghwcx") pod "ec496183-6a46-48d9-9cd3-79969e0f09a5" (UID: "ec496183-6a46-48d9-9cd3-79969e0f09a5"). InnerVolumeSpecName "kube-api-access-ghwcx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 18 16:49:01.216239 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.216219 2536 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "ec496183-6a46-48d9-9cd3-79969e0f09a5" (UID: "ec496183-6a46-48d9-9cd3-79969e0f09a5"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 18 16:49:01.216459 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.216441 2536 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "ec496183-6a46-48d9-9cd3-79969e0f09a5" (UID: "ec496183-6a46-48d9-9cd3-79969e0f09a5"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 18 16:49:01.216653 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.216637 2536 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-config" (OuterVolumeSpecName: "config") pod "ec496183-6a46-48d9-9cd3-79969e0f09a5" (UID: "ec496183-6a46-48d9-9cd3-79969e0f09a5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 18 16:49:01.217787 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.217771 2536 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec496183-6a46-48d9-9cd3-79969e0f09a5-config-out" (OuterVolumeSpecName: "config-out") pod "ec496183-6a46-48d9-9cd3-79969e0f09a5" (UID: "ec496183-6a46-48d9-9cd3-79969e0f09a5"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 18 16:49:01.227555 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.227534 2536 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-web-config" (OuterVolumeSpecName: "web-config") pod "ec496183-6a46-48d9-9cd3-79969e0f09a5" (UID: "ec496183-6a46-48d9-9cd3-79969e0f09a5"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 18 16:49:01.313245 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.313208 2536 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-secret-grpc-tls\") on node \"ip-10-0-132-224.ec2.internal\" DevicePath \"\""
Mar 18 16:49:01.313245 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.313236 2536 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ec496183-6a46-48d9-9cd3-79969e0f09a5-tls-assets\") on node \"ip-10-0-132-224.ec2.internal\" DevicePath \"\""
Mar 18 16:49:01.313245 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.313246 2536 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-secret-metrics-client-certs\") on node \"ip-10-0-132-224.ec2.internal\" DevicePath \"\""
Mar 18 16:49:01.313474 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.313257 2536 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-132-224.ec2.internal\" DevicePath \"\""
Mar 18 16:49:01.313474 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.313283 2536 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-web-config\") on node \"ip-10-0-132-224.ec2.internal\" DevicePath \"\""
Mar 18 16:49:01.313474 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.313293 2536 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-thanos-prometheus-http-client-file\") on node \"ip-10-0-132-224.ec2.internal\" DevicePath \"\""
Mar 18 16:49:01.313474 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.313302 2536 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ec496183-6a46-48d9-9cd3-79969e0f09a5-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-132-224.ec2.internal\" DevicePath \"\""
Mar 18 16:49:01.313474 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.313313 2536 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-config\") on node \"ip-10-0-132-224.ec2.internal\" DevicePath \"\""
Mar 18 16:49:01.313474 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.313321 2536 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ec496183-6a46-48d9-9cd3-79969e0f09a5-config-out\") on node \"ip-10-0-132-224.ec2.internal\" DevicePath \"\""
Mar 18 16:49:01.313474 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.313329 2536 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-secret-kube-rbac-proxy\") on node \"ip-10-0-132-224.ec2.internal\" DevicePath \"\""
Mar 18 16:49:01.313474 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.313338 2536 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName:
\"kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-132-224.ec2.internal\" DevicePath \"\"" Mar 18 16:49:01.313474 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.313347 2536 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec496183-6a46-48d9-9cd3-79969e0f09a5-prometheus-trusted-ca-bundle\") on node \"ip-10-0-132-224.ec2.internal\" DevicePath \"\"" Mar 18 16:49:01.313474 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.313356 2536 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ghwcx\" (UniqueName: \"kubernetes.io/projected/ec496183-6a46-48d9-9cd3-79969e0f09a5-kube-api-access-ghwcx\") on node \"ip-10-0-132-224.ec2.internal\" DevicePath \"\"" Mar 18 16:49:01.313474 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.313365 2536 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ec496183-6a46-48d9-9cd3-79969e0f09a5-prometheus-k8s-db\") on node \"ip-10-0-132-224.ec2.internal\" DevicePath \"\"" Mar 18 16:49:01.313474 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.313374 2536 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ec496183-6a46-48d9-9cd3-79969e0f09a5-secret-prometheus-k8s-tls\") on node \"ip-10-0-132-224.ec2.internal\" DevicePath \"\"" Mar 18 16:49:01.785082 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.785046 2536 generic.go:358] "Generic (PLEG): container finished" podID="ec496183-6a46-48d9-9cd3-79969e0f09a5" containerID="84cbac1d887142ae76b0efe704614130870d0bfb7cfb3979367e24203a6cbe97" exitCode=0 Mar 18 16:49:01.785082 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.785073 2536 generic.go:358] "Generic (PLEG): container finished" podID="ec496183-6a46-48d9-9cd3-79969e0f09a5" 
containerID="68ba1d01581acec4ce60fdfef6c83a79c8cc12070e86c7b1dff3fbf9f9699fcc" exitCode=0 Mar 18 16:49:01.785082 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.785082 2536 generic.go:358] "Generic (PLEG): container finished" podID="ec496183-6a46-48d9-9cd3-79969e0f09a5" containerID="66e77476d47e8a3093be7bdaf14b45a2cfcbb87c85889307c75534c9f312f23f" exitCode=0 Mar 18 16:49:01.785082 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.785090 2536 generic.go:358] "Generic (PLEG): container finished" podID="ec496183-6a46-48d9-9cd3-79969e0f09a5" containerID="0866c607fbe8d1fb480a9dbe3a0aca31fe956a926a4ccfe67210cafe8188b60a" exitCode=0 Mar 18 16:49:01.785472 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.785098 2536 generic.go:358] "Generic (PLEG): container finished" podID="ec496183-6a46-48d9-9cd3-79969e0f09a5" containerID="bc8804455305327453ae7a4611ccbfac4205b6416e35786217b763457495f68d" exitCode=0 Mar 18 16:49:01.785472 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.785107 2536 generic.go:358] "Generic (PLEG): container finished" podID="ec496183-6a46-48d9-9cd3-79969e0f09a5" containerID="e88e3863249bf414fa9d4d3a62ab8c792d4da149b3408350453212f59e45b004" exitCode=0 Mar 18 16:49:01.785472 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.785135 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ec496183-6a46-48d9-9cd3-79969e0f09a5","Type":"ContainerDied","Data":"84cbac1d887142ae76b0efe704614130870d0bfb7cfb3979367e24203a6cbe97"} Mar 18 16:49:01.785472 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.785154 2536 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:01.785472 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.785180 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ec496183-6a46-48d9-9cd3-79969e0f09a5","Type":"ContainerDied","Data":"68ba1d01581acec4ce60fdfef6c83a79c8cc12070e86c7b1dff3fbf9f9699fcc"} Mar 18 16:49:01.785472 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.785197 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ec496183-6a46-48d9-9cd3-79969e0f09a5","Type":"ContainerDied","Data":"66e77476d47e8a3093be7bdaf14b45a2cfcbb87c85889307c75534c9f312f23f"} Mar 18 16:49:01.785472 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.785210 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ec496183-6a46-48d9-9cd3-79969e0f09a5","Type":"ContainerDied","Data":"0866c607fbe8d1fb480a9dbe3a0aca31fe956a926a4ccfe67210cafe8188b60a"} Mar 18 16:49:01.785472 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.785224 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ec496183-6a46-48d9-9cd3-79969e0f09a5","Type":"ContainerDied","Data":"bc8804455305327453ae7a4611ccbfac4205b6416e35786217b763457495f68d"} Mar 18 16:49:01.785472 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.785237 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ec496183-6a46-48d9-9cd3-79969e0f09a5","Type":"ContainerDied","Data":"e88e3863249bf414fa9d4d3a62ab8c792d4da149b3408350453212f59e45b004"} Mar 18 16:49:01.785472 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.785250 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"ec496183-6a46-48d9-9cd3-79969e0f09a5","Type":"ContainerDied","Data":"abab13052157044e9f89c35679d550d5f26d6f4233f746239f4d75d1ba8c16e1"} Mar 18 16:49:01.785472 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.785245 2536 scope.go:117] "RemoveContainer" containerID="84cbac1d887142ae76b0efe704614130870d0bfb7cfb3979367e24203a6cbe97" Mar 18 16:49:01.793048 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.793027 2536 scope.go:117] "RemoveContainer" containerID="68ba1d01581acec4ce60fdfef6c83a79c8cc12070e86c7b1dff3fbf9f9699fcc" Mar 18 16:49:01.799959 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.799943 2536 scope.go:117] "RemoveContainer" containerID="66e77476d47e8a3093be7bdaf14b45a2cfcbb87c85889307c75534c9f312f23f" Mar 18 16:49:01.808388 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.808364 2536 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 18 16:49:01.809537 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.809517 2536 scope.go:117] "RemoveContainer" containerID="0866c607fbe8d1fb480a9dbe3a0aca31fe956a926a4ccfe67210cafe8188b60a" Mar 18 16:49:01.811624 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.811595 2536 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 18 16:49:01.817227 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.817211 2536 scope.go:117] "RemoveContainer" containerID="bc8804455305327453ae7a4611ccbfac4205b6416e35786217b763457495f68d" Mar 18 16:49:01.823603 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.823587 2536 scope.go:117] "RemoveContainer" containerID="e88e3863249bf414fa9d4d3a62ab8c792d4da149b3408350453212f59e45b004" Mar 18 16:49:01.830872 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.830709 2536 scope.go:117] "RemoveContainer" containerID="f35f3e6c310503f1d7efd45005c817bdb81d28100c6ba95893fea781c894f315" Mar 18 16:49:01.836053 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.836034 2536 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 18 16:49:01.836352 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.836339 2536 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec496183-6a46-48d9-9cd3-79969e0f09a5" containerName="config-reloader" Mar 18 16:49:01.836404 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.836354 2536 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec496183-6a46-48d9-9cd3-79969e0f09a5" containerName="config-reloader" Mar 18 16:49:01.836404 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.836364 2536 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec496183-6a46-48d9-9cd3-79969e0f09a5" containerName="kube-rbac-proxy" Mar 18 16:49:01.836404 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.836370 2536 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec496183-6a46-48d9-9cd3-79969e0f09a5" containerName="kube-rbac-proxy" Mar 18 16:49:01.836404 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.836380 2536 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec496183-6a46-48d9-9cd3-79969e0f09a5" containerName="thanos-sidecar" Mar 18 16:49:01.836404 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.836386 2536 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec496183-6a46-48d9-9cd3-79969e0f09a5" containerName="thanos-sidecar" Mar 18 16:49:01.836404 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.836395 2536 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec496183-6a46-48d9-9cd3-79969e0f09a5" containerName="kube-rbac-proxy-thanos" Mar 18 16:49:01.836404 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.836400 2536 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec496183-6a46-48d9-9cd3-79969e0f09a5" containerName="kube-rbac-proxy-thanos" Mar 18 16:49:01.836620 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.836409 2536 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec496183-6a46-48d9-9cd3-79969e0f09a5" containerName="prometheus" Mar 18 16:49:01.836620 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.836414 2536 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec496183-6a46-48d9-9cd3-79969e0f09a5" containerName="prometheus" Mar 18 16:49:01.836620 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.836419 2536 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec496183-6a46-48d9-9cd3-79969e0f09a5" containerName="kube-rbac-proxy-web" Mar 18 16:49:01.836620 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.836426 2536 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec496183-6a46-48d9-9cd3-79969e0f09a5" containerName="kube-rbac-proxy-web" Mar 18 16:49:01.836620 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.836432 2536 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec496183-6a46-48d9-9cd3-79969e0f09a5" containerName="init-config-reloader" Mar 18 16:49:01.836620 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.836437 2536 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec496183-6a46-48d9-9cd3-79969e0f09a5" containerName="init-config-reloader" Mar 18 16:49:01.836620 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.836480 2536 memory_manager.go:356] "RemoveStaleState removing state" podUID="ec496183-6a46-48d9-9cd3-79969e0f09a5" containerName="thanos-sidecar" Mar 18 16:49:01.836620 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.836488 2536 memory_manager.go:356] "RemoveStaleState removing state" podUID="ec496183-6a46-48d9-9cd3-79969e0f09a5" containerName="config-reloader" Mar 18 16:49:01.836620 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.836494 2536 memory_manager.go:356] "RemoveStaleState removing state" podUID="ec496183-6a46-48d9-9cd3-79969e0f09a5" containerName="prometheus" Mar 18 16:49:01.836620 ip-10-0-132-224 kubenswrapper[2536]: I0318 
16:49:01.836500 2536 memory_manager.go:356] "RemoveStaleState removing state" podUID="ec496183-6a46-48d9-9cd3-79969e0f09a5" containerName="kube-rbac-proxy" Mar 18 16:49:01.836620 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.836509 2536 memory_manager.go:356] "RemoveStaleState removing state" podUID="ec496183-6a46-48d9-9cd3-79969e0f09a5" containerName="kube-rbac-proxy-thanos" Mar 18 16:49:01.836620 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.836515 2536 memory_manager.go:356] "RemoveStaleState removing state" podUID="ec496183-6a46-48d9-9cd3-79969e0f09a5" containerName="kube-rbac-proxy-web" Mar 18 16:49:01.837857 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.837835 2536 scope.go:117] "RemoveContainer" containerID="84cbac1d887142ae76b0efe704614130870d0bfb7cfb3979367e24203a6cbe97" Mar 18 16:49:01.838101 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:49:01.838085 2536 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84cbac1d887142ae76b0efe704614130870d0bfb7cfb3979367e24203a6cbe97\": container with ID starting with 84cbac1d887142ae76b0efe704614130870d0bfb7cfb3979367e24203a6cbe97 not found: ID does not exist" containerID="84cbac1d887142ae76b0efe704614130870d0bfb7cfb3979367e24203a6cbe97" Mar 18 16:49:01.838157 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.838110 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84cbac1d887142ae76b0efe704614130870d0bfb7cfb3979367e24203a6cbe97"} err="failed to get container status \"84cbac1d887142ae76b0efe704614130870d0bfb7cfb3979367e24203a6cbe97\": rpc error: code = NotFound desc = could not find container \"84cbac1d887142ae76b0efe704614130870d0bfb7cfb3979367e24203a6cbe97\": container with ID starting with 84cbac1d887142ae76b0efe704614130870d0bfb7cfb3979367e24203a6cbe97 not found: ID does not exist" Mar 18 16:49:01.838157 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.838139 2536 
scope.go:117] "RemoveContainer" containerID="68ba1d01581acec4ce60fdfef6c83a79c8cc12070e86c7b1dff3fbf9f9699fcc" Mar 18 16:49:01.838368 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:49:01.838349 2536 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68ba1d01581acec4ce60fdfef6c83a79c8cc12070e86c7b1dff3fbf9f9699fcc\": container with ID starting with 68ba1d01581acec4ce60fdfef6c83a79c8cc12070e86c7b1dff3fbf9f9699fcc not found: ID does not exist" containerID="68ba1d01581acec4ce60fdfef6c83a79c8cc12070e86c7b1dff3fbf9f9699fcc" Mar 18 16:49:01.838471 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.838371 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68ba1d01581acec4ce60fdfef6c83a79c8cc12070e86c7b1dff3fbf9f9699fcc"} err="failed to get container status \"68ba1d01581acec4ce60fdfef6c83a79c8cc12070e86c7b1dff3fbf9f9699fcc\": rpc error: code = NotFound desc = could not find container \"68ba1d01581acec4ce60fdfef6c83a79c8cc12070e86c7b1dff3fbf9f9699fcc\": container with ID starting with 68ba1d01581acec4ce60fdfef6c83a79c8cc12070e86c7b1dff3fbf9f9699fcc not found: ID does not exist" Mar 18 16:49:01.838471 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.838386 2536 scope.go:117] "RemoveContainer" containerID="66e77476d47e8a3093be7bdaf14b45a2cfcbb87c85889307c75534c9f312f23f" Mar 18 16:49:01.838700 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:49:01.838676 2536 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66e77476d47e8a3093be7bdaf14b45a2cfcbb87c85889307c75534c9f312f23f\": container with ID starting with 66e77476d47e8a3093be7bdaf14b45a2cfcbb87c85889307c75534c9f312f23f not found: ID does not exist" containerID="66e77476d47e8a3093be7bdaf14b45a2cfcbb87c85889307c75534c9f312f23f" Mar 18 16:49:01.838783 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.838707 2536 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66e77476d47e8a3093be7bdaf14b45a2cfcbb87c85889307c75534c9f312f23f"} err="failed to get container status \"66e77476d47e8a3093be7bdaf14b45a2cfcbb87c85889307c75534c9f312f23f\": rpc error: code = NotFound desc = could not find container \"66e77476d47e8a3093be7bdaf14b45a2cfcbb87c85889307c75534c9f312f23f\": container with ID starting with 66e77476d47e8a3093be7bdaf14b45a2cfcbb87c85889307c75534c9f312f23f not found: ID does not exist" Mar 18 16:49:01.838783 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.838726 2536 scope.go:117] "RemoveContainer" containerID="0866c607fbe8d1fb480a9dbe3a0aca31fe956a926a4ccfe67210cafe8188b60a" Mar 18 16:49:01.838949 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:49:01.838930 2536 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0866c607fbe8d1fb480a9dbe3a0aca31fe956a926a4ccfe67210cafe8188b60a\": container with ID starting with 0866c607fbe8d1fb480a9dbe3a0aca31fe956a926a4ccfe67210cafe8188b60a not found: ID does not exist" containerID="0866c607fbe8d1fb480a9dbe3a0aca31fe956a926a4ccfe67210cafe8188b60a" Mar 18 16:49:01.838992 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.838952 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0866c607fbe8d1fb480a9dbe3a0aca31fe956a926a4ccfe67210cafe8188b60a"} err="failed to get container status \"0866c607fbe8d1fb480a9dbe3a0aca31fe956a926a4ccfe67210cafe8188b60a\": rpc error: code = NotFound desc = could not find container \"0866c607fbe8d1fb480a9dbe3a0aca31fe956a926a4ccfe67210cafe8188b60a\": container with ID starting with 0866c607fbe8d1fb480a9dbe3a0aca31fe956a926a4ccfe67210cafe8188b60a not found: ID does not exist" Mar 18 16:49:01.838992 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.838965 2536 scope.go:117] "RemoveContainer" 
containerID="bc8804455305327453ae7a4611ccbfac4205b6416e35786217b763457495f68d" Mar 18 16:49:01.839178 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:49:01.839162 2536 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc8804455305327453ae7a4611ccbfac4205b6416e35786217b763457495f68d\": container with ID starting with bc8804455305327453ae7a4611ccbfac4205b6416e35786217b763457495f68d not found: ID does not exist" containerID="bc8804455305327453ae7a4611ccbfac4205b6416e35786217b763457495f68d" Mar 18 16:49:01.839222 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.839183 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc8804455305327453ae7a4611ccbfac4205b6416e35786217b763457495f68d"} err="failed to get container status \"bc8804455305327453ae7a4611ccbfac4205b6416e35786217b763457495f68d\": rpc error: code = NotFound desc = could not find container \"bc8804455305327453ae7a4611ccbfac4205b6416e35786217b763457495f68d\": container with ID starting with bc8804455305327453ae7a4611ccbfac4205b6416e35786217b763457495f68d not found: ID does not exist" Mar 18 16:49:01.839222 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.839199 2536 scope.go:117] "RemoveContainer" containerID="e88e3863249bf414fa9d4d3a62ab8c792d4da149b3408350453212f59e45b004" Mar 18 16:49:01.839458 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:49:01.839443 2536 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e88e3863249bf414fa9d4d3a62ab8c792d4da149b3408350453212f59e45b004\": container with ID starting with e88e3863249bf414fa9d4d3a62ab8c792d4da149b3408350453212f59e45b004 not found: ID does not exist" containerID="e88e3863249bf414fa9d4d3a62ab8c792d4da149b3408350453212f59e45b004" Mar 18 16:49:01.839509 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.839462 2536 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"e88e3863249bf414fa9d4d3a62ab8c792d4da149b3408350453212f59e45b004"} err="failed to get container status \"e88e3863249bf414fa9d4d3a62ab8c792d4da149b3408350453212f59e45b004\": rpc error: code = NotFound desc = could not find container \"e88e3863249bf414fa9d4d3a62ab8c792d4da149b3408350453212f59e45b004\": container with ID starting with e88e3863249bf414fa9d4d3a62ab8c792d4da149b3408350453212f59e45b004 not found: ID does not exist" Mar 18 16:49:01.839509 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.839475 2536 scope.go:117] "RemoveContainer" containerID="f35f3e6c310503f1d7efd45005c817bdb81d28100c6ba95893fea781c894f315" Mar 18 16:49:01.839699 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:49:01.839683 2536 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f35f3e6c310503f1d7efd45005c817bdb81d28100c6ba95893fea781c894f315\": container with ID starting with f35f3e6c310503f1d7efd45005c817bdb81d28100c6ba95893fea781c894f315 not found: ID does not exist" containerID="f35f3e6c310503f1d7efd45005c817bdb81d28100c6ba95893fea781c894f315" Mar 18 16:49:01.839741 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.839703 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f35f3e6c310503f1d7efd45005c817bdb81d28100c6ba95893fea781c894f315"} err="failed to get container status \"f35f3e6c310503f1d7efd45005c817bdb81d28100c6ba95893fea781c894f315\": rpc error: code = NotFound desc = could not find container \"f35f3e6c310503f1d7efd45005c817bdb81d28100c6ba95893fea781c894f315\": container with ID starting with f35f3e6c310503f1d7efd45005c817bdb81d28100c6ba95893fea781c894f315 not found: ID does not exist" Mar 18 16:49:01.839741 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.839717 2536 scope.go:117] "RemoveContainer" containerID="84cbac1d887142ae76b0efe704614130870d0bfb7cfb3979367e24203a6cbe97" Mar 18 16:49:01.839931 
ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.839912 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84cbac1d887142ae76b0efe704614130870d0bfb7cfb3979367e24203a6cbe97"} err="failed to get container status \"84cbac1d887142ae76b0efe704614130870d0bfb7cfb3979367e24203a6cbe97\": rpc error: code = NotFound desc = could not find container \"84cbac1d887142ae76b0efe704614130870d0bfb7cfb3979367e24203a6cbe97\": container with ID starting with 84cbac1d887142ae76b0efe704614130870d0bfb7cfb3979367e24203a6cbe97 not found: ID does not exist" Mar 18 16:49:01.839999 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.839933 2536 scope.go:117] "RemoveContainer" containerID="68ba1d01581acec4ce60fdfef6c83a79c8cc12070e86c7b1dff3fbf9f9699fcc" Mar 18 16:49:01.840168 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.840152 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68ba1d01581acec4ce60fdfef6c83a79c8cc12070e86c7b1dff3fbf9f9699fcc"} err="failed to get container status \"68ba1d01581acec4ce60fdfef6c83a79c8cc12070e86c7b1dff3fbf9f9699fcc\": rpc error: code = NotFound desc = could not find container \"68ba1d01581acec4ce60fdfef6c83a79c8cc12070e86c7b1dff3fbf9f9699fcc\": container with ID starting with 68ba1d01581acec4ce60fdfef6c83a79c8cc12070e86c7b1dff3fbf9f9699fcc not found: ID does not exist" Mar 18 16:49:01.840243 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.840170 2536 scope.go:117] "RemoveContainer" containerID="66e77476d47e8a3093be7bdaf14b45a2cfcbb87c85889307c75534c9f312f23f" Mar 18 16:49:01.840403 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.840384 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66e77476d47e8a3093be7bdaf14b45a2cfcbb87c85889307c75534c9f312f23f"} err="failed to get container status \"66e77476d47e8a3093be7bdaf14b45a2cfcbb87c85889307c75534c9f312f23f\": rpc error: code = NotFound desc = could not 
find container \"66e77476d47e8a3093be7bdaf14b45a2cfcbb87c85889307c75534c9f312f23f\": container with ID starting with 66e77476d47e8a3093be7bdaf14b45a2cfcbb87c85889307c75534c9f312f23f not found: ID does not exist" Mar 18 16:49:01.840457 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.840404 2536 scope.go:117] "RemoveContainer" containerID="0866c607fbe8d1fb480a9dbe3a0aca31fe956a926a4ccfe67210cafe8188b60a" Mar 18 16:49:01.840581 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.840563 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0866c607fbe8d1fb480a9dbe3a0aca31fe956a926a4ccfe67210cafe8188b60a"} err="failed to get container status \"0866c607fbe8d1fb480a9dbe3a0aca31fe956a926a4ccfe67210cafe8188b60a\": rpc error: code = NotFound desc = could not find container \"0866c607fbe8d1fb480a9dbe3a0aca31fe956a926a4ccfe67210cafe8188b60a\": container with ID starting with 0866c607fbe8d1fb480a9dbe3a0aca31fe956a926a4ccfe67210cafe8188b60a not found: ID does not exist" Mar 18 16:49:01.840643 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.840583 2536 scope.go:117] "RemoveContainer" containerID="bc8804455305327453ae7a4611ccbfac4205b6416e35786217b763457495f68d" Mar 18 16:49:01.840795 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.840779 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc8804455305327453ae7a4611ccbfac4205b6416e35786217b763457495f68d"} err="failed to get container status \"bc8804455305327453ae7a4611ccbfac4205b6416e35786217b763457495f68d\": rpc error: code = NotFound desc = could not find container \"bc8804455305327453ae7a4611ccbfac4205b6416e35786217b763457495f68d\": container with ID starting with bc8804455305327453ae7a4611ccbfac4205b6416e35786217b763457495f68d not found: ID does not exist" Mar 18 16:49:01.840840 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.840796 2536 scope.go:117] "RemoveContainer" 
containerID="e88e3863249bf414fa9d4d3a62ab8c792d4da149b3408350453212f59e45b004" Mar 18 16:49:01.840976 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.840959 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e88e3863249bf414fa9d4d3a62ab8c792d4da149b3408350453212f59e45b004"} err="failed to get container status \"e88e3863249bf414fa9d4d3a62ab8c792d4da149b3408350453212f59e45b004\": rpc error: code = NotFound desc = could not find container \"e88e3863249bf414fa9d4d3a62ab8c792d4da149b3408350453212f59e45b004\": container with ID starting with e88e3863249bf414fa9d4d3a62ab8c792d4da149b3408350453212f59e45b004 not found: ID does not exist" Mar 18 16:49:01.841041 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.840977 2536 scope.go:117] "RemoveContainer" containerID="f35f3e6c310503f1d7efd45005c817bdb81d28100c6ba95893fea781c894f315" Mar 18 16:49:01.841195 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.841174 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f35f3e6c310503f1d7efd45005c817bdb81d28100c6ba95893fea781c894f315"} err="failed to get container status \"f35f3e6c310503f1d7efd45005c817bdb81d28100c6ba95893fea781c894f315\": rpc error: code = NotFound desc = could not find container \"f35f3e6c310503f1d7efd45005c817bdb81d28100c6ba95893fea781c894f315\": container with ID starting with f35f3e6c310503f1d7efd45005c817bdb81d28100c6ba95893fea781c894f315 not found: ID does not exist" Mar 18 16:49:01.841233 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.841198 2536 scope.go:117] "RemoveContainer" containerID="84cbac1d887142ae76b0efe704614130870d0bfb7cfb3979367e24203a6cbe97" Mar 18 16:49:01.841436 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.841416 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84cbac1d887142ae76b0efe704614130870d0bfb7cfb3979367e24203a6cbe97"} err="failed to get container status 
\"84cbac1d887142ae76b0efe704614130870d0bfb7cfb3979367e24203a6cbe97\": rpc error: code = NotFound desc = could not find container \"84cbac1d887142ae76b0efe704614130870d0bfb7cfb3979367e24203a6cbe97\": container with ID starting with 84cbac1d887142ae76b0efe704614130870d0bfb7cfb3979367e24203a6cbe97 not found: ID does not exist" Mar 18 16:49:01.841436 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.841434 2536 scope.go:117] "RemoveContainer" containerID="68ba1d01581acec4ce60fdfef6c83a79c8cc12070e86c7b1dff3fbf9f9699fcc" Mar 18 16:49:01.841578 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.841563 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:01.841627 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.841611 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68ba1d01581acec4ce60fdfef6c83a79c8cc12070e86c7b1dff3fbf9f9699fcc"} err="failed to get container status \"68ba1d01581acec4ce60fdfef6c83a79c8cc12070e86c7b1dff3fbf9f9699fcc\": rpc error: code = NotFound desc = could not find container \"68ba1d01581acec4ce60fdfef6c83a79c8cc12070e86c7b1dff3fbf9f9699fcc\": container with ID starting with 68ba1d01581acec4ce60fdfef6c83a79c8cc12070e86c7b1dff3fbf9f9699fcc not found: ID does not exist" Mar 18 16:49:01.841666 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.841629 2536 scope.go:117] "RemoveContainer" containerID="66e77476d47e8a3093be7bdaf14b45a2cfcbb87c85889307c75534c9f312f23f" Mar 18 16:49:01.841833 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.841811 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66e77476d47e8a3093be7bdaf14b45a2cfcbb87c85889307c75534c9f312f23f"} err="failed to get container status \"66e77476d47e8a3093be7bdaf14b45a2cfcbb87c85889307c75534c9f312f23f\": rpc error: code = NotFound desc = could not find container 
\"66e77476d47e8a3093be7bdaf14b45a2cfcbb87c85889307c75534c9f312f23f\": container with ID starting with 66e77476d47e8a3093be7bdaf14b45a2cfcbb87c85889307c75534c9f312f23f not found: ID does not exist" Mar 18 16:49:01.841925 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.841837 2536 scope.go:117] "RemoveContainer" containerID="0866c607fbe8d1fb480a9dbe3a0aca31fe956a926a4ccfe67210cafe8188b60a" Mar 18 16:49:01.842057 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.842031 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0866c607fbe8d1fb480a9dbe3a0aca31fe956a926a4ccfe67210cafe8188b60a"} err="failed to get container status \"0866c607fbe8d1fb480a9dbe3a0aca31fe956a926a4ccfe67210cafe8188b60a\": rpc error: code = NotFound desc = could not find container \"0866c607fbe8d1fb480a9dbe3a0aca31fe956a926a4ccfe67210cafe8188b60a\": container with ID starting with 0866c607fbe8d1fb480a9dbe3a0aca31fe956a926a4ccfe67210cafe8188b60a not found: ID does not exist" Mar 18 16:49:01.842149 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.842058 2536 scope.go:117] "RemoveContainer" containerID="bc8804455305327453ae7a4611ccbfac4205b6416e35786217b763457495f68d" Mar 18 16:49:01.842327 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.842304 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc8804455305327453ae7a4611ccbfac4205b6416e35786217b763457495f68d"} err="failed to get container status \"bc8804455305327453ae7a4611ccbfac4205b6416e35786217b763457495f68d\": rpc error: code = NotFound desc = could not find container \"bc8804455305327453ae7a4611ccbfac4205b6416e35786217b763457495f68d\": container with ID starting with bc8804455305327453ae7a4611ccbfac4205b6416e35786217b763457495f68d not found: ID does not exist" Mar 18 16:49:01.842383 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.842330 2536 scope.go:117] "RemoveContainer" 
containerID="e88e3863249bf414fa9d4d3a62ab8c792d4da149b3408350453212f59e45b004" Mar 18 16:49:01.842642 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.842623 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e88e3863249bf414fa9d4d3a62ab8c792d4da149b3408350453212f59e45b004"} err="failed to get container status \"e88e3863249bf414fa9d4d3a62ab8c792d4da149b3408350453212f59e45b004\": rpc error: code = NotFound desc = could not find container \"e88e3863249bf414fa9d4d3a62ab8c792d4da149b3408350453212f59e45b004\": container with ID starting with e88e3863249bf414fa9d4d3a62ab8c792d4da149b3408350453212f59e45b004 not found: ID does not exist" Mar 18 16:49:01.842685 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.842643 2536 scope.go:117] "RemoveContainer" containerID="f35f3e6c310503f1d7efd45005c817bdb81d28100c6ba95893fea781c894f315" Mar 18 16:49:01.842900 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.842878 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f35f3e6c310503f1d7efd45005c817bdb81d28100c6ba95893fea781c894f315"} err="failed to get container status \"f35f3e6c310503f1d7efd45005c817bdb81d28100c6ba95893fea781c894f315\": rpc error: code = NotFound desc = could not find container \"f35f3e6c310503f1d7efd45005c817bdb81d28100c6ba95893fea781c894f315\": container with ID starting with f35f3e6c310503f1d7efd45005c817bdb81d28100c6ba95893fea781c894f315 not found: ID does not exist" Mar 18 16:49:01.842952 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.842901 2536 scope.go:117] "RemoveContainer" containerID="84cbac1d887142ae76b0efe704614130870d0bfb7cfb3979367e24203a6cbe97" Mar 18 16:49:01.843134 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.843114 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84cbac1d887142ae76b0efe704614130870d0bfb7cfb3979367e24203a6cbe97"} err="failed to get container status 
\"84cbac1d887142ae76b0efe704614130870d0bfb7cfb3979367e24203a6cbe97\": rpc error: code = NotFound desc = could not find container \"84cbac1d887142ae76b0efe704614130870d0bfb7cfb3979367e24203a6cbe97\": container with ID starting with 84cbac1d887142ae76b0efe704614130870d0bfb7cfb3979367e24203a6cbe97 not found: ID does not exist" Mar 18 16:49:01.843134 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.843133 2536 scope.go:117] "RemoveContainer" containerID="68ba1d01581acec4ce60fdfef6c83a79c8cc12070e86c7b1dff3fbf9f9699fcc" Mar 18 16:49:01.843401 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.843377 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68ba1d01581acec4ce60fdfef6c83a79c8cc12070e86c7b1dff3fbf9f9699fcc"} err="failed to get container status \"68ba1d01581acec4ce60fdfef6c83a79c8cc12070e86c7b1dff3fbf9f9699fcc\": rpc error: code = NotFound desc = could not find container \"68ba1d01581acec4ce60fdfef6c83a79c8cc12070e86c7b1dff3fbf9f9699fcc\": container with ID starting with 68ba1d01581acec4ce60fdfef6c83a79c8cc12070e86c7b1dff3fbf9f9699fcc not found: ID does not exist" Mar 18 16:49:01.843472 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.843403 2536 scope.go:117] "RemoveContainer" containerID="66e77476d47e8a3093be7bdaf14b45a2cfcbb87c85889307c75534c9f312f23f" Mar 18 16:49:01.843537 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.843494 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Mar 18 16:49:01.843673 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.843645 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66e77476d47e8a3093be7bdaf14b45a2cfcbb87c85889307c75534c9f312f23f"} err="failed to get container status \"66e77476d47e8a3093be7bdaf14b45a2cfcbb87c85889307c75534c9f312f23f\": rpc error: code = NotFound desc = could not find container 
\"66e77476d47e8a3093be7bdaf14b45a2cfcbb87c85889307c75534c9f312f23f\": container with ID starting with 66e77476d47e8a3093be7bdaf14b45a2cfcbb87c85889307c75534c9f312f23f not found: ID does not exist" Mar 18 16:49:01.843751 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.843672 2536 scope.go:117] "RemoveContainer" containerID="0866c607fbe8d1fb480a9dbe3a0aca31fe956a926a4ccfe67210cafe8188b60a" Mar 18 16:49:01.843751 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.843723 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-b9qsevsatj8e6\"" Mar 18 16:49:01.843751 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.843735 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Mar 18 16:49:01.843896 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.843778 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Mar 18 16:49:01.843896 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.843783 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-tthfc\"" Mar 18 16:49:01.843896 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.843840 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Mar 18 16:49:01.844131 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.844104 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0866c607fbe8d1fb480a9dbe3a0aca31fe956a926a4ccfe67210cafe8188b60a"} err="failed to get container status \"0866c607fbe8d1fb480a9dbe3a0aca31fe956a926a4ccfe67210cafe8188b60a\": rpc error: code = NotFound desc = could not find container \"0866c607fbe8d1fb480a9dbe3a0aca31fe956a926a4ccfe67210cafe8188b60a\": 
container with ID starting with 0866c607fbe8d1fb480a9dbe3a0aca31fe956a926a4ccfe67210cafe8188b60a not found: ID does not exist" Mar 18 16:49:01.844234 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.844132 2536 scope.go:117] "RemoveContainer" containerID="bc8804455305327453ae7a4611ccbfac4205b6416e35786217b763457495f68d" Mar 18 16:49:01.844348 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.844329 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Mar 18 16:49:01.844451 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.844382 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc8804455305327453ae7a4611ccbfac4205b6416e35786217b763457495f68d"} err="failed to get container status \"bc8804455305327453ae7a4611ccbfac4205b6416e35786217b763457495f68d\": rpc error: code = NotFound desc = could not find container \"bc8804455305327453ae7a4611ccbfac4205b6416e35786217b763457495f68d\": container with ID starting with bc8804455305327453ae7a4611ccbfac4205b6416e35786217b763457495f68d not found: ID does not exist" Mar 18 16:49:01.844451 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.844421 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Mar 18 16:49:01.844451 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.844432 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Mar 18 16:49:01.844628 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.844423 2536 scope.go:117] "RemoveContainer" containerID="e88e3863249bf414fa9d4d3a62ab8c792d4da149b3408350453212f59e45b004" Mar 18 16:49:01.844628 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.844533 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Mar 18 
16:49:01.844727 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.844634 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Mar 18 16:49:01.844727 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.844651 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Mar 18 16:49:01.844832 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.844794 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e88e3863249bf414fa9d4d3a62ab8c792d4da149b3408350453212f59e45b004"} err="failed to get container status \"e88e3863249bf414fa9d4d3a62ab8c792d4da149b3408350453212f59e45b004\": rpc error: code = NotFound desc = could not find container \"e88e3863249bf414fa9d4d3a62ab8c792d4da149b3408350453212f59e45b004\": container with ID starting with e88e3863249bf414fa9d4d3a62ab8c792d4da149b3408350453212f59e45b004 not found: ID does not exist" Mar 18 16:49:01.844832 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.844819 2536 scope.go:117] "RemoveContainer" containerID="f35f3e6c310503f1d7efd45005c817bdb81d28100c6ba95893fea781c894f315" Mar 18 16:49:01.844948 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.844924 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Mar 18 16:49:01.845179 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.845155 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f35f3e6c310503f1d7efd45005c817bdb81d28100c6ba95893fea781c894f315"} err="failed to get container status \"f35f3e6c310503f1d7efd45005c817bdb81d28100c6ba95893fea781c894f315\": rpc error: code = NotFound desc = could not find container \"f35f3e6c310503f1d7efd45005c817bdb81d28100c6ba95893fea781c894f315\": container with ID starting with 
f35f3e6c310503f1d7efd45005c817bdb81d28100c6ba95893fea781c894f315 not found: ID does not exist" Mar 18 16:49:01.845257 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.845181 2536 scope.go:117] "RemoveContainer" containerID="84cbac1d887142ae76b0efe704614130870d0bfb7cfb3979367e24203a6cbe97" Mar 18 16:49:01.845518 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.845493 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84cbac1d887142ae76b0efe704614130870d0bfb7cfb3979367e24203a6cbe97"} err="failed to get container status \"84cbac1d887142ae76b0efe704614130870d0bfb7cfb3979367e24203a6cbe97\": rpc error: code = NotFound desc = could not find container \"84cbac1d887142ae76b0efe704614130870d0bfb7cfb3979367e24203a6cbe97\": container with ID starting with 84cbac1d887142ae76b0efe704614130870d0bfb7cfb3979367e24203a6cbe97 not found: ID does not exist" Mar 18 16:49:01.845621 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.845519 2536 scope.go:117] "RemoveContainer" containerID="68ba1d01581acec4ce60fdfef6c83a79c8cc12070e86c7b1dff3fbf9f9699fcc" Mar 18 16:49:01.845905 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.845880 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68ba1d01581acec4ce60fdfef6c83a79c8cc12070e86c7b1dff3fbf9f9699fcc"} err="failed to get container status \"68ba1d01581acec4ce60fdfef6c83a79c8cc12070e86c7b1dff3fbf9f9699fcc\": rpc error: code = NotFound desc = could not find container \"68ba1d01581acec4ce60fdfef6c83a79c8cc12070e86c7b1dff3fbf9f9699fcc\": container with ID starting with 68ba1d01581acec4ce60fdfef6c83a79c8cc12070e86c7b1dff3fbf9f9699fcc not found: ID does not exist" Mar 18 16:49:01.845998 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.845906 2536 scope.go:117] "RemoveContainer" containerID="66e77476d47e8a3093be7bdaf14b45a2cfcbb87c85889307c75534c9f312f23f" Mar 18 16:49:01.846210 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.846186 2536 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66e77476d47e8a3093be7bdaf14b45a2cfcbb87c85889307c75534c9f312f23f"} err="failed to get container status \"66e77476d47e8a3093be7bdaf14b45a2cfcbb87c85889307c75534c9f312f23f\": rpc error: code = NotFound desc = could not find container \"66e77476d47e8a3093be7bdaf14b45a2cfcbb87c85889307c75534c9f312f23f\": container with ID starting with 66e77476d47e8a3093be7bdaf14b45a2cfcbb87c85889307c75534c9f312f23f not found: ID does not exist" Mar 18 16:49:01.846289 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.846215 2536 scope.go:117] "RemoveContainer" containerID="0866c607fbe8d1fb480a9dbe3a0aca31fe956a926a4ccfe67210cafe8188b60a" Mar 18 16:49:01.846496 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.846474 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0866c607fbe8d1fb480a9dbe3a0aca31fe956a926a4ccfe67210cafe8188b60a"} err="failed to get container status \"0866c607fbe8d1fb480a9dbe3a0aca31fe956a926a4ccfe67210cafe8188b60a\": rpc error: code = NotFound desc = could not find container \"0866c607fbe8d1fb480a9dbe3a0aca31fe956a926a4ccfe67210cafe8188b60a\": container with ID starting with 0866c607fbe8d1fb480a9dbe3a0aca31fe956a926a4ccfe67210cafe8188b60a not found: ID does not exist" Mar 18 16:49:01.846496 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.846496 2536 scope.go:117] "RemoveContainer" containerID="bc8804455305327453ae7a4611ccbfac4205b6416e35786217b763457495f68d" Mar 18 16:49:01.846646 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.846563 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Mar 18 16:49:01.846742 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.846719 2536 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bc8804455305327453ae7a4611ccbfac4205b6416e35786217b763457495f68d"} err="failed to get container status \"bc8804455305327453ae7a4611ccbfac4205b6416e35786217b763457495f68d\": rpc error: code = NotFound desc = could not find container \"bc8804455305327453ae7a4611ccbfac4205b6416e35786217b763457495f68d\": container with ID starting with bc8804455305327453ae7a4611ccbfac4205b6416e35786217b763457495f68d not found: ID does not exist" Mar 18 16:49:01.846806 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.846744 2536 scope.go:117] "RemoveContainer" containerID="e88e3863249bf414fa9d4d3a62ab8c792d4da149b3408350453212f59e45b004" Mar 18 16:49:01.846994 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.846976 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e88e3863249bf414fa9d4d3a62ab8c792d4da149b3408350453212f59e45b004"} err="failed to get container status \"e88e3863249bf414fa9d4d3a62ab8c792d4da149b3408350453212f59e45b004\": rpc error: code = NotFound desc = could not find container \"e88e3863249bf414fa9d4d3a62ab8c792d4da149b3408350453212f59e45b004\": container with ID starting with e88e3863249bf414fa9d4d3a62ab8c792d4da149b3408350453212f59e45b004 not found: ID does not exist" Mar 18 16:49:01.847123 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.846995 2536 scope.go:117] "RemoveContainer" containerID="f35f3e6c310503f1d7efd45005c817bdb81d28100c6ba95893fea781c894f315" Mar 18 16:49:01.847232 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.847211 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f35f3e6c310503f1d7efd45005c817bdb81d28100c6ba95893fea781c894f315"} err="failed to get container status \"f35f3e6c310503f1d7efd45005c817bdb81d28100c6ba95893fea781c894f315\": rpc error: code = NotFound desc = could not find container \"f35f3e6c310503f1d7efd45005c817bdb81d28100c6ba95893fea781c894f315\": container with ID starting with 
f35f3e6c310503f1d7efd45005c817bdb81d28100c6ba95893fea781c894f315 not found: ID does not exist" Mar 18 16:49:01.847317 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.847232 2536 scope.go:117] "RemoveContainer" containerID="84cbac1d887142ae76b0efe704614130870d0bfb7cfb3979367e24203a6cbe97" Mar 18 16:49:01.847529 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.847509 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84cbac1d887142ae76b0efe704614130870d0bfb7cfb3979367e24203a6cbe97"} err="failed to get container status \"84cbac1d887142ae76b0efe704614130870d0bfb7cfb3979367e24203a6cbe97\": rpc error: code = NotFound desc = could not find container \"84cbac1d887142ae76b0efe704614130870d0bfb7cfb3979367e24203a6cbe97\": container with ID starting with 84cbac1d887142ae76b0efe704614130870d0bfb7cfb3979367e24203a6cbe97 not found: ID does not exist" Mar 18 16:49:01.847577 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.847530 2536 scope.go:117] "RemoveContainer" containerID="68ba1d01581acec4ce60fdfef6c83a79c8cc12070e86c7b1dff3fbf9f9699fcc" Mar 18 16:49:01.847773 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.847756 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68ba1d01581acec4ce60fdfef6c83a79c8cc12070e86c7b1dff3fbf9f9699fcc"} err="failed to get container status \"68ba1d01581acec4ce60fdfef6c83a79c8cc12070e86c7b1dff3fbf9f9699fcc\": rpc error: code = NotFound desc = could not find container \"68ba1d01581acec4ce60fdfef6c83a79c8cc12070e86c7b1dff3fbf9f9699fcc\": container with ID starting with 68ba1d01581acec4ce60fdfef6c83a79c8cc12070e86c7b1dff3fbf9f9699fcc not found: ID does not exist" Mar 18 16:49:01.847810 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.847775 2536 scope.go:117] "RemoveContainer" containerID="66e77476d47e8a3093be7bdaf14b45a2cfcbb87c85889307c75534c9f312f23f" Mar 18 16:49:01.848097 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.848072 2536 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66e77476d47e8a3093be7bdaf14b45a2cfcbb87c85889307c75534c9f312f23f"} err="failed to get container status \"66e77476d47e8a3093be7bdaf14b45a2cfcbb87c85889307c75534c9f312f23f\": rpc error: code = NotFound desc = could not find container \"66e77476d47e8a3093be7bdaf14b45a2cfcbb87c85889307c75534c9f312f23f\": container with ID starting with 66e77476d47e8a3093be7bdaf14b45a2cfcbb87c85889307c75534c9f312f23f not found: ID does not exist" Mar 18 16:49:01.848097 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.848096 2536 scope.go:117] "RemoveContainer" containerID="0866c607fbe8d1fb480a9dbe3a0aca31fe956a926a4ccfe67210cafe8188b60a" Mar 18 16:49:01.848385 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.848359 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0866c607fbe8d1fb480a9dbe3a0aca31fe956a926a4ccfe67210cafe8188b60a"} err="failed to get container status \"0866c607fbe8d1fb480a9dbe3a0aca31fe956a926a4ccfe67210cafe8188b60a\": rpc error: code = NotFound desc = could not find container \"0866c607fbe8d1fb480a9dbe3a0aca31fe956a926a4ccfe67210cafe8188b60a\": container with ID starting with 0866c607fbe8d1fb480a9dbe3a0aca31fe956a926a4ccfe67210cafe8188b60a not found: ID does not exist" Mar 18 16:49:01.848514 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.848385 2536 scope.go:117] "RemoveContainer" containerID="bc8804455305327453ae7a4611ccbfac4205b6416e35786217b763457495f68d" Mar 18 16:49:01.848761 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.848718 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc8804455305327453ae7a4611ccbfac4205b6416e35786217b763457495f68d"} err="failed to get container status \"bc8804455305327453ae7a4611ccbfac4205b6416e35786217b763457495f68d\": rpc error: code = NotFound desc = could not find container 
\"bc8804455305327453ae7a4611ccbfac4205b6416e35786217b763457495f68d\": container with ID starting with bc8804455305327453ae7a4611ccbfac4205b6416e35786217b763457495f68d not found: ID does not exist" Mar 18 16:49:01.848761 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.848740 2536 scope.go:117] "RemoveContainer" containerID="e88e3863249bf414fa9d4d3a62ab8c792d4da149b3408350453212f59e45b004" Mar 18 16:49:01.849040 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.849019 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e88e3863249bf414fa9d4d3a62ab8c792d4da149b3408350453212f59e45b004"} err="failed to get container status \"e88e3863249bf414fa9d4d3a62ab8c792d4da149b3408350453212f59e45b004\": rpc error: code = NotFound desc = could not find container \"e88e3863249bf414fa9d4d3a62ab8c792d4da149b3408350453212f59e45b004\": container with ID starting with e88e3863249bf414fa9d4d3a62ab8c792d4da149b3408350453212f59e45b004 not found: ID does not exist" Mar 18 16:49:01.849040 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.849040 2536 scope.go:117] "RemoveContainer" containerID="f35f3e6c310503f1d7efd45005c817bdb81d28100c6ba95893fea781c894f315" Mar 18 16:49:01.849325 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.849302 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f35f3e6c310503f1d7efd45005c817bdb81d28100c6ba95893fea781c894f315"} err="failed to get container status \"f35f3e6c310503f1d7efd45005c817bdb81d28100c6ba95893fea781c894f315\": rpc error: code = NotFound desc = could not find container \"f35f3e6c310503f1d7efd45005c817bdb81d28100c6ba95893fea781c894f315\": container with ID starting with f35f3e6c310503f1d7efd45005c817bdb81d28100c6ba95893fea781c894f315 not found: ID does not exist" Mar 18 16:49:01.849928 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.849655 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Mar 18 16:49:01.854146 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.854123 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 18 16:49:01.910381 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.910355 2536 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec496183-6a46-48d9-9cd3-79969e0f09a5" path="/var/lib/kubelet/pods/ec496183-6a46-48d9-9cd3-79969e0f09a5/volumes" Mar 18 16:49:01.917489 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.917472 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a30a0e81-c0ff-4186-be97-f99f05750aad-config-out\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:01.917562 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.917502 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a30a0e81-c0ff-4186-be97-f99f05750aad-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:01.917562 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.917522 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a30a0e81-c0ff-4186-be97-f99f05750aad-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:01.917562 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.917539 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a30a0e81-c0ff-4186-be97-f99f05750aad-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:01.917562 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.917558 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a30a0e81-c0ff-4186-be97-f99f05750aad-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:01.917705 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.917581 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a30a0e81-c0ff-4186-be97-f99f05750aad-config\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:01.917705 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.917599 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a30a0e81-c0ff-4186-be97-f99f05750aad-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:01.917705 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.917615 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a30a0e81-c0ff-4186-be97-f99f05750aad-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " 
pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:01.917705 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.917652 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a30a0e81-c0ff-4186-be97-f99f05750aad-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:01.917705 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.917678 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a30a0e81-c0ff-4186-be97-f99f05750aad-web-config\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:01.917705 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.917698 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a30a0e81-c0ff-4186-be97-f99f05750aad-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:01.917904 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.917730 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a30a0e81-c0ff-4186-be97-f99f05750aad-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:01.917904 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.917765 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/a30a0e81-c0ff-4186-be97-f99f05750aad-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:01.917904 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.917808 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a30a0e81-c0ff-4186-be97-f99f05750aad-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:01.917904 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.917837 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a30a0e81-c0ff-4186-be97-f99f05750aad-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:01.917904 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.917857 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a30a0e81-c0ff-4186-be97-f99f05750aad-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:01.917904 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.917883 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a30a0e81-c0ff-4186-be97-f99f05750aad-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:01.917904 
ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:01.917901 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzbfg\" (UniqueName: \"kubernetes.io/projected/a30a0e81-c0ff-4186-be97-f99f05750aad-kube-api-access-zzbfg\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:02.019088 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:02.019050 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a30a0e81-c0ff-4186-be97-f99f05750aad-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:02.019251 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:02.019098 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a30a0e81-c0ff-4186-be97-f99f05750aad-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:02.019251 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:02.019125 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a30a0e81-c0ff-4186-be97-f99f05750aad-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:02.019251 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:02.019162 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a30a0e81-c0ff-4186-be97-f99f05750aad-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: 
\"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:02.019251 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:02.019195 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a30a0e81-c0ff-4186-be97-f99f05750aad-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:02.019251 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:02.019224 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a30a0e81-c0ff-4186-be97-f99f05750aad-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:02.019547 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:02.019397 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a30a0e81-c0ff-4186-be97-f99f05750aad-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:02.019547 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:02.019436 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zzbfg\" (UniqueName: \"kubernetes.io/projected/a30a0e81-c0ff-4186-be97-f99f05750aad-kube-api-access-zzbfg\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:02.019547 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:02.019483 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/a30a0e81-c0ff-4186-be97-f99f05750aad-config-out\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:02.019547 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:02.019515 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a30a0e81-c0ff-4186-be97-f99f05750aad-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:02.021360 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:02.019517 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a30a0e81-c0ff-4186-be97-f99f05750aad-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:02.021360 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:02.019985 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a30a0e81-c0ff-4186-be97-f99f05750aad-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:02.021360 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:02.020014 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a30a0e81-c0ff-4186-be97-f99f05750aad-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:02.021360 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:02.020050 2536 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a30a0e81-c0ff-4186-be97-f99f05750aad-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:02.021360 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:02.020076 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a30a0e81-c0ff-4186-be97-f99f05750aad-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:02.021360 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:02.020109 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a30a0e81-c0ff-4186-be97-f99f05750aad-config\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:02.021360 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:02.020137 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a30a0e81-c0ff-4186-be97-f99f05750aad-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:02.021360 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:02.020166 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a30a0e81-c0ff-4186-be97-f99f05750aad-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:02.021360 
ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:02.020197 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a30a0e81-c0ff-4186-be97-f99f05750aad-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:02.021360 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:02.020246 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a30a0e81-c0ff-4186-be97-f99f05750aad-web-config\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:02.021360 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:02.020321 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a30a0e81-c0ff-4186-be97-f99f05750aad-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:02.021360 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:02.021002 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a30a0e81-c0ff-4186-be97-f99f05750aad-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:02.021360 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:02.021017 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a30a0e81-c0ff-4186-be97-f99f05750aad-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " 
pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:02.022879 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:02.022257 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a30a0e81-c0ff-4186-be97-f99f05750aad-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:02.022879 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:02.022389 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a30a0e81-c0ff-4186-be97-f99f05750aad-config-out\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:02.022879 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:02.022563 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a30a0e81-c0ff-4186-be97-f99f05750aad-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:02.022879 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:02.022846 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a30a0e81-c0ff-4186-be97-f99f05750aad-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:02.023699 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:02.023587 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a30a0e81-c0ff-4186-be97-f99f05750aad-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: 
\"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:02.024070 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:02.024046 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a30a0e81-c0ff-4186-be97-f99f05750aad-web-config\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:02.024147 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:02.024058 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a30a0e81-c0ff-4186-be97-f99f05750aad-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:02.024257 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:02.024239 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a30a0e81-c0ff-4186-be97-f99f05750aad-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:02.024914 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:02.024892 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a30a0e81-c0ff-4186-be97-f99f05750aad-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:02.024975 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:02.024922 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a30a0e81-c0ff-4186-be97-f99f05750aad-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: 
\"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:02.025208 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:02.025188 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a30a0e81-c0ff-4186-be97-f99f05750aad-config\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:02.025438 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:02.025422 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a30a0e81-c0ff-4186-be97-f99f05750aad-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:02.027193 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:02.027177 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzbfg\" (UniqueName: \"kubernetes.io/projected/a30a0e81-c0ff-4186-be97-f99f05750aad-kube-api-access-zzbfg\") pod \"prometheus-k8s-0\" (UID: \"a30a0e81-c0ff-4186-be97-f99f05750aad\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:02.153007 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:02.152922 2536 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:02.276614 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:02.276581 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 18 16:49:02.279639 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:49:02.279610 2536 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda30a0e81_c0ff_4186_be97_f99f05750aad.slice/crio-73da2da598bfd1de3e67e56a496a89b2685524f137b7231a6780082b3ee832de WatchSource:0}: Error finding container 73da2da598bfd1de3e67e56a496a89b2685524f137b7231a6780082b3ee832de: Status 404 returned error can't find the container with id 73da2da598bfd1de3e67e56a496a89b2685524f137b7231a6780082b3ee832de Mar 18 16:49:02.795217 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:02.795177 2536 generic.go:358] "Generic (PLEG): container finished" podID="a30a0e81-c0ff-4186-be97-f99f05750aad" containerID="da3f36f555edd255e25e11d378ba9e908f840b0c75c4f3d8980fc627c0c0e820" exitCode=0 Mar 18 16:49:02.795402 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:02.795261 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a30a0e81-c0ff-4186-be97-f99f05750aad","Type":"ContainerDied","Data":"da3f36f555edd255e25e11d378ba9e908f840b0c75c4f3d8980fc627c0c0e820"} Mar 18 16:49:02.795402 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:02.795307 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a30a0e81-c0ff-4186-be97-f99f05750aad","Type":"ContainerStarted","Data":"73da2da598bfd1de3e67e56a496a89b2685524f137b7231a6780082b3ee832de"} Mar 18 16:49:03.801412 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:03.801373 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"a30a0e81-c0ff-4186-be97-f99f05750aad","Type":"ContainerStarted","Data":"c161f6358325bf6225215f95a38a9d64b043fe9039fcbc67b0e8e623ba99457f"} Mar 18 16:49:03.801412 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:03.801414 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a30a0e81-c0ff-4186-be97-f99f05750aad","Type":"ContainerStarted","Data":"98f7a79a7940b6951aa87eedf1654ffa7124e20d63658ec9ad98fdd3d707d6bf"} Mar 18 16:49:03.801828 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:03.801430 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a30a0e81-c0ff-4186-be97-f99f05750aad","Type":"ContainerStarted","Data":"4fba2fafc7d6128a1334ae539ace89e45016833ed448d7d033d57f61019b43a3"} Mar 18 16:49:03.801828 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:03.801443 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a30a0e81-c0ff-4186-be97-f99f05750aad","Type":"ContainerStarted","Data":"0a0d4050957b2dbc67f03d512b769705aa70b139d1c8358a28eaf3c9b7b64693"} Mar 18 16:49:03.801828 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:03.801454 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a30a0e81-c0ff-4186-be97-f99f05750aad","Type":"ContainerStarted","Data":"a49e3c95cb72bfc9cb5bf4ab27147277a07fd6a81ac373cea14c52251f4cea05"} Mar 18 16:49:03.801828 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:03.801466 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a30a0e81-c0ff-4186-be97-f99f05750aad","Type":"ContainerStarted","Data":"5f82e9d95794b0636819decf649ca27299f5687fb121b0d13090fd7a2c90b35c"} Mar 18 16:49:03.829818 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:03.829758 2536 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.829739676 podStartE2EDuration="2.829739676s" podCreationTimestamp="2026-03-18 16:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:49:03.82853771 +0000 UTC m=+266.602112297" watchObservedRunningTime="2026-03-18 16:49:03.829739676 +0000 UTC m=+266.603314263" Mar 18 16:49:07.153875 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:07.153818 2536 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:49:37.766688 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:37.766660 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-mx9sz_95e73a3f-8a85-403f-b00b-17524a80b500/console-operator/2.log" Mar 18 16:49:37.767731 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:37.767706 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-mx9sz_95e73a3f-8a85-403f-b00b-17524a80b500/console-operator/2.log" Mar 18 16:49:37.773614 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:37.773590 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ts4mw_3e6fe5a1-3b13-4d51-a141-1280e1b25b3a/ovn-acl-logging/0.log" Mar 18 16:49:37.774800 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:49:37.774782 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ts4mw_3e6fe5a1-3b13-4d51-a141-1280e1b25b3a/ovn-acl-logging/0.log" Mar 18 16:50:02.154015 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:50:02.153983 2536 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:50:02.169052 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:50:02.169029 2536 kubelet.go:2658] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:50:02.985934 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:50:02.985908 2536 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:53:07.404702 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:07.404666 2536 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-2q2pd"] Mar 18 16:53:07.407726 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:07.407711 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-2q2pd" Mar 18 16:53:07.410084 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:07.410054 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-bcxgc\"" Mar 18 16:53:07.410227 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:07.410141 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Mar 18 16:53:07.411694 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:07.411675 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Mar 18 16:53:07.424401 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:07.424376 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-2q2pd"] Mar 18 16:53:07.488107 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:07.488074 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87wd5\" (UniqueName: \"kubernetes.io/projected/73676752-0b3f-4a59-ae62-78bab0bbe0dc-kube-api-access-87wd5\") pod \"cert-manager-cainjector-8966b78d4-2q2pd\" (UID: \"73676752-0b3f-4a59-ae62-78bab0bbe0dc\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-2q2pd" Mar 18 
16:53:07.488253 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:07.488145 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/73676752-0b3f-4a59-ae62-78bab0bbe0dc-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-2q2pd\" (UID: \"73676752-0b3f-4a59-ae62-78bab0bbe0dc\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-2q2pd" Mar 18 16:53:07.589110 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:07.589075 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-87wd5\" (UniqueName: \"kubernetes.io/projected/73676752-0b3f-4a59-ae62-78bab0bbe0dc-kube-api-access-87wd5\") pod \"cert-manager-cainjector-8966b78d4-2q2pd\" (UID: \"73676752-0b3f-4a59-ae62-78bab0bbe0dc\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-2q2pd" Mar 18 16:53:07.589312 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:07.589136 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/73676752-0b3f-4a59-ae62-78bab0bbe0dc-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-2q2pd\" (UID: \"73676752-0b3f-4a59-ae62-78bab0bbe0dc\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-2q2pd" Mar 18 16:53:07.596865 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:07.596828 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/73676752-0b3f-4a59-ae62-78bab0bbe0dc-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-2q2pd\" (UID: \"73676752-0b3f-4a59-ae62-78bab0bbe0dc\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-2q2pd" Mar 18 16:53:07.596988 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:07.596942 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-87wd5\" (UniqueName: 
\"kubernetes.io/projected/73676752-0b3f-4a59-ae62-78bab0bbe0dc-kube-api-access-87wd5\") pod \"cert-manager-cainjector-8966b78d4-2q2pd\" (UID: \"73676752-0b3f-4a59-ae62-78bab0bbe0dc\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-2q2pd" Mar 18 16:53:07.724583 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:07.724556 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-2q2pd" Mar 18 16:53:07.843512 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:07.843483 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-2q2pd"] Mar 18 16:53:07.846554 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:53:07.846523 2536 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73676752_0b3f_4a59_ae62_78bab0bbe0dc.slice/crio-3155a82322c7bcc99af6d84d5aeb257d360c8402e97d9f8f6fba60ea2786efbd WatchSource:0}: Error finding container 3155a82322c7bcc99af6d84d5aeb257d360c8402e97d9f8f6fba60ea2786efbd: Status 404 returned error can't find the container with id 3155a82322c7bcc99af6d84d5aeb257d360c8402e97d9f8f6fba60ea2786efbd Mar 18 16:53:07.848403 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:07.848383 2536 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 16:53:08.492105 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:08.492059 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-2q2pd" event={"ID":"73676752-0b3f-4a59-ae62-78bab0bbe0dc","Type":"ContainerStarted","Data":"3155a82322c7bcc99af6d84d5aeb257d360c8402e97d9f8f6fba60ea2786efbd"} Mar 18 16:53:11.502195 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:11.502158 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-2q2pd" 
event={"ID":"73676752-0b3f-4a59-ae62-78bab0bbe0dc","Type":"ContainerStarted","Data":"cde099f800e20eea696545ad7e631bc9d62a548bfe6ac145eafe41a0433fabcf"} Mar 18 16:53:11.519831 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:11.519790 2536 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-2q2pd" podStartSLOduration=1.313685888 podStartE2EDuration="4.519775526s" podCreationTimestamp="2026-03-18 16:53:07 +0000 UTC" firstStartedPulling="2026-03-18 16:53:07.848542847 +0000 UTC m=+510.622117411" lastFinishedPulling="2026-03-18 16:53:11.054632485 +0000 UTC m=+513.828207049" observedRunningTime="2026-03-18 16:53:11.518884909 +0000 UTC m=+514.292459505" watchObservedRunningTime="2026-03-18 16:53:11.519775526 +0000 UTC m=+514.293350112" Mar 18 16:53:45.733693 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:45.733609 2536 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gxrtl"] Mar 18 16:53:45.741136 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:45.741109 2536 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gxrtl" Mar 18 16:53:45.743205 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:45.743187 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Mar 18 16:53:45.743617 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:45.743599 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"openshift-ai-inference-openshift-default-dockercfg-slqcg\"" Mar 18 16:53:45.747525 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:45.747501 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gxrtl"] Mar 18 16:53:45.802258 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:45.802228 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/3c55b02a-defe-494b-9c4b-3312f62ed11b-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gxrtl\" (UID: \"3c55b02a-defe-494b-9c4b-3312f62ed11b\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gxrtl" Mar 18 16:53:45.802428 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:45.802287 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/3c55b02a-defe-494b-9c4b-3312f62ed11b-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gxrtl\" (UID: \"3c55b02a-defe-494b-9c4b-3312f62ed11b\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gxrtl" Mar 18 16:53:45.802428 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:45.802370 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/3c55b02a-defe-494b-9c4b-3312f62ed11b-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gxrtl\" (UID: \"3c55b02a-defe-494b-9c4b-3312f62ed11b\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gxrtl" Mar 18 16:53:45.802428 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:45.802414 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/3c55b02a-defe-494b-9c4b-3312f62ed11b-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gxrtl\" (UID: \"3c55b02a-defe-494b-9c4b-3312f62ed11b\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gxrtl" Mar 18 16:53:45.802555 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:45.802454 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/3c55b02a-defe-494b-9c4b-3312f62ed11b-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gxrtl\" (UID: \"3c55b02a-defe-494b-9c4b-3312f62ed11b\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gxrtl" Mar 18 16:53:45.802555 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:45.802486 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/3c55b02a-defe-494b-9c4b-3312f62ed11b-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gxrtl\" (UID: \"3c55b02a-defe-494b-9c4b-3312f62ed11b\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gxrtl" Mar 18 16:53:45.802555 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:45.802507 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: 
\"kubernetes.io/empty-dir/3c55b02a-defe-494b-9c4b-3312f62ed11b-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gxrtl\" (UID: \"3c55b02a-defe-494b-9c4b-3312f62ed11b\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gxrtl" Mar 18 16:53:45.802555 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:45.802524 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/3c55b02a-defe-494b-9c4b-3312f62ed11b-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gxrtl\" (UID: \"3c55b02a-defe-494b-9c4b-3312f62ed11b\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gxrtl" Mar 18 16:53:45.802555 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:45.802548 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5dwv\" (UniqueName: \"kubernetes.io/projected/3c55b02a-defe-494b-9c4b-3312f62ed11b-kube-api-access-z5dwv\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gxrtl\" (UID: \"3c55b02a-defe-494b-9c4b-3312f62ed11b\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gxrtl" Mar 18 16:53:45.903598 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:45.903559 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/3c55b02a-defe-494b-9c4b-3312f62ed11b-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gxrtl\" (UID: \"3c55b02a-defe-494b-9c4b-3312f62ed11b\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gxrtl" Mar 18 16:53:45.903762 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:45.903620 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/3c55b02a-defe-494b-9c4b-3312f62ed11b-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gxrtl\" (UID: \"3c55b02a-defe-494b-9c4b-3312f62ed11b\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gxrtl" Mar 18 16:53:45.903762 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:45.903656 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/3c55b02a-defe-494b-9c4b-3312f62ed11b-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gxrtl\" (UID: \"3c55b02a-defe-494b-9c4b-3312f62ed11b\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gxrtl" Mar 18 16:53:45.903762 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:45.903685 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/3c55b02a-defe-494b-9c4b-3312f62ed11b-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gxrtl\" (UID: \"3c55b02a-defe-494b-9c4b-3312f62ed11b\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gxrtl" Mar 18 16:53:45.903762 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:45.903711 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/3c55b02a-defe-494b-9c4b-3312f62ed11b-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gxrtl\" (UID: \"3c55b02a-defe-494b-9c4b-3312f62ed11b\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gxrtl" Mar 18 16:53:45.903762 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:45.903735 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z5dwv\" (UniqueName: \"kubernetes.io/projected/3c55b02a-defe-494b-9c4b-3312f62ed11b-kube-api-access-z5dwv\") pod 
\"openshift-ai-inference-openshift-default-7c5447bb76-gxrtl\" (UID: \"3c55b02a-defe-494b-9c4b-3312f62ed11b\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gxrtl" Mar 18 16:53:45.903995 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:45.903870 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/3c55b02a-defe-494b-9c4b-3312f62ed11b-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gxrtl\" (UID: \"3c55b02a-defe-494b-9c4b-3312f62ed11b\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gxrtl" Mar 18 16:53:45.903995 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:45.903946 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/3c55b02a-defe-494b-9c4b-3312f62ed11b-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gxrtl\" (UID: \"3c55b02a-defe-494b-9c4b-3312f62ed11b\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gxrtl" Mar 18 16:53:45.904103 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:45.903999 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/3c55b02a-defe-494b-9c4b-3312f62ed11b-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gxrtl\" (UID: \"3c55b02a-defe-494b-9c4b-3312f62ed11b\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gxrtl" Mar 18 16:53:45.904103 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:45.904088 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/3c55b02a-defe-494b-9c4b-3312f62ed11b-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gxrtl\" (UID: 
\"3c55b02a-defe-494b-9c4b-3312f62ed11b\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gxrtl" Mar 18 16:53:45.904255 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:45.904230 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/3c55b02a-defe-494b-9c4b-3312f62ed11b-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gxrtl\" (UID: \"3c55b02a-defe-494b-9c4b-3312f62ed11b\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gxrtl" Mar 18 16:53:45.904367 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:45.904248 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/3c55b02a-defe-494b-9c4b-3312f62ed11b-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gxrtl\" (UID: \"3c55b02a-defe-494b-9c4b-3312f62ed11b\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gxrtl" Mar 18 16:53:45.904460 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:45.904435 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/3c55b02a-defe-494b-9c4b-3312f62ed11b-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gxrtl\" (UID: \"3c55b02a-defe-494b-9c4b-3312f62ed11b\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gxrtl" Mar 18 16:53:45.904570 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:45.904516 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/3c55b02a-defe-494b-9c4b-3312f62ed11b-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gxrtl\" (UID: \"3c55b02a-defe-494b-9c4b-3312f62ed11b\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gxrtl" Mar 18 
16:53:45.906117 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:45.906097 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/3c55b02a-defe-494b-9c4b-3312f62ed11b-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gxrtl\" (UID: \"3c55b02a-defe-494b-9c4b-3312f62ed11b\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gxrtl" Mar 18 16:53:45.906509 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:45.906488 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/3c55b02a-defe-494b-9c4b-3312f62ed11b-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gxrtl\" (UID: \"3c55b02a-defe-494b-9c4b-3312f62ed11b\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gxrtl" Mar 18 16:53:45.917218 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:45.917197 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/3c55b02a-defe-494b-9c4b-3312f62ed11b-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gxrtl\" (UID: \"3c55b02a-defe-494b-9c4b-3312f62ed11b\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gxrtl" Mar 18 16:53:45.917314 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:45.917231 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5dwv\" (UniqueName: \"kubernetes.io/projected/3c55b02a-defe-494b-9c4b-3312f62ed11b-kube-api-access-z5dwv\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gxrtl\" (UID: \"3c55b02a-defe-494b-9c4b-3312f62ed11b\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gxrtl" Mar 18 16:53:46.053170 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:46.053105 2536 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gxrtl" Mar 18 16:53:46.177866 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:46.177836 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gxrtl"] Mar 18 16:53:46.179665 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:53:46.179631 2536 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c55b02a_defe_494b_9c4b_3312f62ed11b.slice/crio-b538539d36e5cdedb147979b95008842e7745d3809d1d18eecc245764a594f66 WatchSource:0}: Error finding container b538539d36e5cdedb147979b95008842e7745d3809d1d18eecc245764a594f66: Status 404 returned error can't find the container with id b538539d36e5cdedb147979b95008842e7745d3809d1d18eecc245764a594f66 Mar 18 16:53:46.605292 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:46.605238 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gxrtl" event={"ID":"3c55b02a-defe-494b-9c4b-3312f62ed11b","Type":"ContainerStarted","Data":"b538539d36e5cdedb147979b95008842e7745d3809d1d18eecc245764a594f66"} Mar 18 16:53:49.040417 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:49.040380 2536 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Mar 18 16:53:49.040698 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:49.040453 2536 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Mar 18 16:53:49.040698 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:49.040499 2536 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Mar 18 16:53:49.616138 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:49.616105 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gxrtl" event={"ID":"3c55b02a-defe-494b-9c4b-3312f62ed11b","Type":"ContainerStarted","Data":"90e4da958d246ef81ded707af5f937d90ad67a579c93fe4228d042411399a669"} Mar 18 16:53:49.635716 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:49.635668 2536 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gxrtl" podStartSLOduration=1.776999516 podStartE2EDuration="4.635653302s" podCreationTimestamp="2026-03-18 16:53:45 +0000 UTC" firstStartedPulling="2026-03-18 16:53:46.181464308 +0000 UTC m=+548.955038876" lastFinishedPulling="2026-03-18 16:53:49.040118097 +0000 UTC m=+551.813692662" observedRunningTime="2026-03-18 16:53:49.634293994 +0000 UTC m=+552.407868576" watchObservedRunningTime="2026-03-18 16:53:49.635653302 +0000 UTC m=+552.409227887" Mar 18 16:53:50.054127 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:50.054086 2536 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gxrtl" Mar 18 16:53:50.058568 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:50.058547 2536 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gxrtl" Mar 18 16:53:50.619169 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:50.619132 2536 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gxrtl" Mar 18 16:53:50.620106 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:50.620089 2536 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gxrtl" Mar 18 16:53:59.098299 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:59.098252 2536 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-84978b767b-9bt9l"] Mar 18 16:53:59.101533 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:59.101516 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-84978b767b-9bt9l" Mar 18 16:53:59.103442 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:59.103420 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Mar 18 16:53:59.104096 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:59.104073 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Mar 18 16:53:59.104096 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:59.104094 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Mar 18 16:53:59.104096 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:59.104097 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-5lx2r\"" Mar 18 16:53:59.104300 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:59.104106 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Mar 18 16:53:59.104300 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:59.104210 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Mar 18 16:53:59.111765 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:59.111743 2536 kubelet.go:2544] "SyncLoop 
UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-84978b767b-9bt9l"] Mar 18 16:53:59.216624 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:59.216588 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d80b0f51-0292-4738-a17d-217b8bebbe6f-cert\") pod \"lws-controller-manager-84978b767b-9bt9l\" (UID: \"d80b0f51-0292-4738-a17d-217b8bebbe6f\") " pod="openshift-lws-operator/lws-controller-manager-84978b767b-9bt9l" Mar 18 16:53:59.216828 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:59.216667 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj5qg\" (UniqueName: \"kubernetes.io/projected/d80b0f51-0292-4738-a17d-217b8bebbe6f-kube-api-access-jj5qg\") pod \"lws-controller-manager-84978b767b-9bt9l\" (UID: \"d80b0f51-0292-4738-a17d-217b8bebbe6f\") " pod="openshift-lws-operator/lws-controller-manager-84978b767b-9bt9l" Mar 18 16:53:59.216828 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:59.216760 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/d80b0f51-0292-4738-a17d-217b8bebbe6f-metrics-cert\") pod \"lws-controller-manager-84978b767b-9bt9l\" (UID: \"d80b0f51-0292-4738-a17d-217b8bebbe6f\") " pod="openshift-lws-operator/lws-controller-manager-84978b767b-9bt9l" Mar 18 16:53:59.216828 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:59.216801 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/d80b0f51-0292-4738-a17d-217b8bebbe6f-manager-config\") pod \"lws-controller-manager-84978b767b-9bt9l\" (UID: \"d80b0f51-0292-4738-a17d-217b8bebbe6f\") " pod="openshift-lws-operator/lws-controller-manager-84978b767b-9bt9l" Mar 18 16:53:59.317838 ip-10-0-132-224 kubenswrapper[2536]: I0318 
16:53:59.317804 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jj5qg\" (UniqueName: \"kubernetes.io/projected/d80b0f51-0292-4738-a17d-217b8bebbe6f-kube-api-access-jj5qg\") pod \"lws-controller-manager-84978b767b-9bt9l\" (UID: \"d80b0f51-0292-4738-a17d-217b8bebbe6f\") " pod="openshift-lws-operator/lws-controller-manager-84978b767b-9bt9l" Mar 18 16:53:59.318011 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:59.317875 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/d80b0f51-0292-4738-a17d-217b8bebbe6f-metrics-cert\") pod \"lws-controller-manager-84978b767b-9bt9l\" (UID: \"d80b0f51-0292-4738-a17d-217b8bebbe6f\") " pod="openshift-lws-operator/lws-controller-manager-84978b767b-9bt9l" Mar 18 16:53:59.318011 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:59.317919 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/d80b0f51-0292-4738-a17d-217b8bebbe6f-manager-config\") pod \"lws-controller-manager-84978b767b-9bt9l\" (UID: \"d80b0f51-0292-4738-a17d-217b8bebbe6f\") " pod="openshift-lws-operator/lws-controller-manager-84978b767b-9bt9l" Mar 18 16:53:59.318011 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:59.317970 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d80b0f51-0292-4738-a17d-217b8bebbe6f-cert\") pod \"lws-controller-manager-84978b767b-9bt9l\" (UID: \"d80b0f51-0292-4738-a17d-217b8bebbe6f\") " pod="openshift-lws-operator/lws-controller-manager-84978b767b-9bt9l" Mar 18 16:53:59.318692 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:59.318662 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/d80b0f51-0292-4738-a17d-217b8bebbe6f-manager-config\") pod 
\"lws-controller-manager-84978b767b-9bt9l\" (UID: \"d80b0f51-0292-4738-a17d-217b8bebbe6f\") " pod="openshift-lws-operator/lws-controller-manager-84978b767b-9bt9l" Mar 18 16:53:59.320457 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:59.320436 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/d80b0f51-0292-4738-a17d-217b8bebbe6f-metrics-cert\") pod \"lws-controller-manager-84978b767b-9bt9l\" (UID: \"d80b0f51-0292-4738-a17d-217b8bebbe6f\") " pod="openshift-lws-operator/lws-controller-manager-84978b767b-9bt9l" Mar 18 16:53:59.320457 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:59.320453 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d80b0f51-0292-4738-a17d-217b8bebbe6f-cert\") pod \"lws-controller-manager-84978b767b-9bt9l\" (UID: \"d80b0f51-0292-4738-a17d-217b8bebbe6f\") " pod="openshift-lws-operator/lws-controller-manager-84978b767b-9bt9l" Mar 18 16:53:59.329921 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:59.329898 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj5qg\" (UniqueName: \"kubernetes.io/projected/d80b0f51-0292-4738-a17d-217b8bebbe6f-kube-api-access-jj5qg\") pod \"lws-controller-manager-84978b767b-9bt9l\" (UID: \"d80b0f51-0292-4738-a17d-217b8bebbe6f\") " pod="openshift-lws-operator/lws-controller-manager-84978b767b-9bt9l" Mar 18 16:53:59.411085 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:59.410989 2536 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-84978b767b-9bt9l" Mar 18 16:53:59.531865 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:59.531831 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-84978b767b-9bt9l"] Mar 18 16:53:59.535913 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:53:59.535885 2536 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd80b0f51_0292_4738_a17d_217b8bebbe6f.slice/crio-c9bad4e9eb94d2e3689e6afbd47b16d1d10d4cce79772dc79282dfb0a1af3d64 WatchSource:0}: Error finding container c9bad4e9eb94d2e3689e6afbd47b16d1d10d4cce79772dc79282dfb0a1af3d64: Status 404 returned error can't find the container with id c9bad4e9eb94d2e3689e6afbd47b16d1d10d4cce79772dc79282dfb0a1af3d64 Mar 18 16:53:59.646147 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:53:59.646105 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-84978b767b-9bt9l" event={"ID":"d80b0f51-0292-4738-a17d-217b8bebbe6f","Type":"ContainerStarted","Data":"c9bad4e9eb94d2e3689e6afbd47b16d1d10d4cce79772dc79282dfb0a1af3d64"} Mar 18 16:54:02.656380 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:02.656337 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-84978b767b-9bt9l" event={"ID":"d80b0f51-0292-4738-a17d-217b8bebbe6f","Type":"ContainerStarted","Data":"89efdf5405caa1bad49564beed3365f1244cf566930dacdf00478ed510fb6ab9"} Mar 18 16:54:02.656750 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:02.656470 2536 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-84978b767b-9bt9l" Mar 18 16:54:02.673376 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:02.673331 2536 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/lws-controller-manager-84978b767b-9bt9l" podStartSLOduration=1.137437673 podStartE2EDuration="3.673317869s" podCreationTimestamp="2026-03-18 16:53:59 +0000 UTC" firstStartedPulling="2026-03-18 16:53:59.53773156 +0000 UTC m=+562.311306127" lastFinishedPulling="2026-03-18 16:54:02.073611758 +0000 UTC m=+564.847186323" observedRunningTime="2026-03-18 16:54:02.671183722 +0000 UTC m=+565.444758323" watchObservedRunningTime="2026-03-18 16:54:02.673317869 +0000 UTC m=+565.446892499" Mar 18 16:54:13.661598 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:13.661557 2536 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-84978b767b-9bt9l" Mar 18 16:54:24.393847 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:24.393815 2536 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-5zl8n"] Mar 18 16:54:24.396735 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:24.396719 2536 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-5zl8n" Mar 18 16:54:24.398969 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:24.398933 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Mar 18 16:54:24.398969 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:24.398957 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Mar 18 16:54:24.399198 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:24.399040 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-sc8zm\"" Mar 18 16:54:24.399198 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:24.399044 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Mar 18 16:54:24.406175 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:24.406152 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-5zl8n"] Mar 18 16:54:24.431721 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:24.431688 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbhcg\" (UniqueName: \"kubernetes.io/projected/4ac0b078-2731-455d-b061-ed53f3521660-kube-api-access-qbhcg\") pod \"dns-operator-controller-manager-844548ff4c-5zl8n\" (UID: \"4ac0b078-2731-455d-b061-ed53f3521660\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-5zl8n" Mar 18 16:54:24.532738 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:24.532698 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qbhcg\" (UniqueName: \"kubernetes.io/projected/4ac0b078-2731-455d-b061-ed53f3521660-kube-api-access-qbhcg\") pod 
\"dns-operator-controller-manager-844548ff4c-5zl8n\" (UID: \"4ac0b078-2731-455d-b061-ed53f3521660\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-5zl8n" Mar 18 16:54:24.540070 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:24.540039 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbhcg\" (UniqueName: \"kubernetes.io/projected/4ac0b078-2731-455d-b061-ed53f3521660-kube-api-access-qbhcg\") pod \"dns-operator-controller-manager-844548ff4c-5zl8n\" (UID: \"4ac0b078-2731-455d-b061-ed53f3521660\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-5zl8n" Mar 18 16:54:24.707421 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:24.707390 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-5zl8n" Mar 18 16:54:24.840453 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:24.840425 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-5zl8n"] Mar 18 16:54:24.843111 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:54:24.843080 2536 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ac0b078_2731_455d_b061_ed53f3521660.slice/crio-8e3db415762d00c40484c2b2e83ce972eaa49b0a2481af26d79092cfad5c1a9c WatchSource:0}: Error finding container 8e3db415762d00c40484c2b2e83ce972eaa49b0a2481af26d79092cfad5c1a9c: Status 404 returned error can't find the container with id 8e3db415762d00c40484c2b2e83ce972eaa49b0a2481af26d79092cfad5c1a9c Mar 18 16:54:25.731991 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:25.731949 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-5zl8n" event={"ID":"4ac0b078-2731-455d-b061-ed53f3521660","Type":"ContainerStarted","Data":"8e3db415762d00c40484c2b2e83ce972eaa49b0a2481af26d79092cfad5c1a9c"} Mar 18 
16:54:27.739947 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:27.739921 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-5zl8n" event={"ID":"4ac0b078-2731-455d-b061-ed53f3521660","Type":"ContainerStarted","Data":"9a70c8e272bb27d1c6e9cab7428d74a4a6a3f654251a7788de462ebb3578bbde"} Mar 18 16:54:27.740254 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:27.740049 2536 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-5zl8n" Mar 18 16:54:27.764020 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:27.763974 2536 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-5zl8n" podStartSLOduration=0.985891314 podStartE2EDuration="3.763958874s" podCreationTimestamp="2026-03-18 16:54:24 +0000 UTC" firstStartedPulling="2026-03-18 16:54:24.845257795 +0000 UTC m=+587.618832374" lastFinishedPulling="2026-03-18 16:54:27.623325368 +0000 UTC m=+590.396899934" observedRunningTime="2026-03-18 16:54:27.762769201 +0000 UTC m=+590.536343789" watchObservedRunningTime="2026-03-18 16:54:27.763958874 +0000 UTC m=+590.537533461" Mar 18 16:54:34.514750 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:34.514716 2536 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-6wzcj"] Mar 18 16:54:34.519125 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:34.519106 2536 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-6wzcj" Mar 18 16:54:34.521143 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:34.521119 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Mar 18 16:54:34.521143 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:34.521132 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-sd7tx\"" Mar 18 16:54:34.525779 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:34.525754 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-6wzcj"] Mar 18 16:54:34.616113 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:34.616082 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-6wzcj"] Mar 18 16:54:34.623321 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:34.623295 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8fss\" (UniqueName: \"kubernetes.io/projected/aecf286a-11a3-4f4d-aa45-54aa5f55f3ce-kube-api-access-s8fss\") pod \"limitador-limitador-64c8f475fb-6wzcj\" (UID: \"aecf286a-11a3-4f4d-aa45-54aa5f55f3ce\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-6wzcj" Mar 18 16:54:34.623450 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:34.623343 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/aecf286a-11a3-4f4d-aa45-54aa5f55f3ce-config-file\") pod \"limitador-limitador-64c8f475fb-6wzcj\" (UID: \"aecf286a-11a3-4f4d-aa45-54aa5f55f3ce\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-6wzcj" Mar 18 16:54:34.724253 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:34.724221 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-s8fss\" (UniqueName: \"kubernetes.io/projected/aecf286a-11a3-4f4d-aa45-54aa5f55f3ce-kube-api-access-s8fss\") pod \"limitador-limitador-64c8f475fb-6wzcj\" (UID: \"aecf286a-11a3-4f4d-aa45-54aa5f55f3ce\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-6wzcj"
Mar 18 16:54:34.724253 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:34.724257 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/aecf286a-11a3-4f4d-aa45-54aa5f55f3ce-config-file\") pod \"limitador-limitador-64c8f475fb-6wzcj\" (UID: \"aecf286a-11a3-4f4d-aa45-54aa5f55f3ce\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-6wzcj"
Mar 18 16:54:34.724934 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:34.724907 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/aecf286a-11a3-4f4d-aa45-54aa5f55f3ce-config-file\") pod \"limitador-limitador-64c8f475fb-6wzcj\" (UID: \"aecf286a-11a3-4f4d-aa45-54aa5f55f3ce\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-6wzcj"
Mar 18 16:54:34.734502 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:34.734479 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8fss\" (UniqueName: \"kubernetes.io/projected/aecf286a-11a3-4f4d-aa45-54aa5f55f3ce-kube-api-access-s8fss\") pod \"limitador-limitador-64c8f475fb-6wzcj\" (UID: \"aecf286a-11a3-4f4d-aa45-54aa5f55f3ce\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-6wzcj"
Mar 18 16:54:34.830089 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:34.829991 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-6wzcj"
Mar 18 16:54:34.950040 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:34.950015 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-6wzcj"]
Mar 18 16:54:34.952858 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:54:34.952830 2536 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaecf286a_11a3_4f4d_aa45_54aa5f55f3ce.slice/crio-d638c7998de03ce76481b849fc6df7cde500d9822868a37a31d515965a22ada3 WatchSource:0}: Error finding container d638c7998de03ce76481b849fc6df7cde500d9822868a37a31d515965a22ada3: Status 404 returned error can't find the container with id d638c7998de03ce76481b849fc6df7cde500d9822868a37a31d515965a22ada3
Mar 18 16:54:35.144599 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:35.144519 2536 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-4699c"]
Mar 18 16:54:35.147685 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:35.147664 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-4699c"
Mar 18 16:54:35.149490 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:35.149469 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\""
Mar 18 16:54:35.149599 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:35.149517 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\""
Mar 18 16:54:35.157107 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:35.157085 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-4699c"]
Mar 18 16:54:35.229310 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:35.229251 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/3f985d57-b8fc-45bb-9d34-5007209116e4-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-4699c\" (UID: \"3f985d57-b8fc-45bb-9d34-5007209116e4\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-4699c"
Mar 18 16:54:35.229469 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:35.229341 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3f985d57-b8fc-45bb-9d34-5007209116e4-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-4699c\" (UID: \"3f985d57-b8fc-45bb-9d34-5007209116e4\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-4699c"
Mar 18 16:54:35.229469 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:35.229403 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h59wx\" (UniqueName: \"kubernetes.io/projected/3f985d57-b8fc-45bb-9d34-5007209116e4-kube-api-access-h59wx\") pod \"kuadrant-console-plugin-6c886788f8-4699c\" (UID: \"3f985d57-b8fc-45bb-9d34-5007209116e4\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-4699c"
Mar 18 16:54:35.330614 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:35.330570 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h59wx\" (UniqueName: \"kubernetes.io/projected/3f985d57-b8fc-45bb-9d34-5007209116e4-kube-api-access-h59wx\") pod \"kuadrant-console-plugin-6c886788f8-4699c\" (UID: \"3f985d57-b8fc-45bb-9d34-5007209116e4\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-4699c"
Mar 18 16:54:35.330793 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:35.330675 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/3f985d57-b8fc-45bb-9d34-5007209116e4-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-4699c\" (UID: \"3f985d57-b8fc-45bb-9d34-5007209116e4\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-4699c"
Mar 18 16:54:35.330793 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:35.330709 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3f985d57-b8fc-45bb-9d34-5007209116e4-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-4699c\" (UID: \"3f985d57-b8fc-45bb-9d34-5007209116e4\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-4699c"
Mar 18 16:54:35.331348 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:35.331329 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3f985d57-b8fc-45bb-9d34-5007209116e4-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-4699c\" (UID: \"3f985d57-b8fc-45bb-9d34-5007209116e4\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-4699c"
Mar 18 16:54:35.333194 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:35.333171 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/3f985d57-b8fc-45bb-9d34-5007209116e4-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-4699c\" (UID: \"3f985d57-b8fc-45bb-9d34-5007209116e4\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-4699c"
Mar 18 16:54:35.345571 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:35.345548 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h59wx\" (UniqueName: \"kubernetes.io/projected/3f985d57-b8fc-45bb-9d34-5007209116e4-kube-api-access-h59wx\") pod \"kuadrant-console-plugin-6c886788f8-4699c\" (UID: \"3f985d57-b8fc-45bb-9d34-5007209116e4\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-4699c"
Mar 18 16:54:35.458631 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:35.458599 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-4699c"
Mar 18 16:54:35.578402 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:35.578376 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-4699c"]
Mar 18 16:54:35.580885 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:54:35.580858 2536 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f985d57_b8fc_45bb_9d34_5007209116e4.slice/crio-3f9aa3ee4bb067e20471c363109872f2b7f5e624400c814fc9af6123730443c6 WatchSource:0}: Error finding container 3f9aa3ee4bb067e20471c363109872f2b7f5e624400c814fc9af6123730443c6: Status 404 returned error can't find the container with id 3f9aa3ee4bb067e20471c363109872f2b7f5e624400c814fc9af6123730443c6
Mar 18 16:54:35.766314 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:35.766224 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-4699c" event={"ID":"3f985d57-b8fc-45bb-9d34-5007209116e4","Type":"ContainerStarted","Data":"3f9aa3ee4bb067e20471c363109872f2b7f5e624400c814fc9af6123730443c6"}
Mar 18 16:54:35.767294 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:35.767251 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-6wzcj" event={"ID":"aecf286a-11a3-4f4d-aa45-54aa5f55f3ce","Type":"ContainerStarted","Data":"d638c7998de03ce76481b849fc6df7cde500d9822868a37a31d515965a22ada3"}
Mar 18 16:54:38.746051 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:38.746014 2536 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-5zl8n"
Mar 18 16:54:40.271682 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:40.271629 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-mx9sz_95e73a3f-8a85-403f-b00b-17524a80b500/console-operator/2.log"
Mar 18 16:54:40.272058 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:40.271954 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-mx9sz_95e73a3f-8a85-403f-b00b-17524a80b500/console-operator/2.log"
Mar 18 16:54:40.276848 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:40.276827 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ts4mw_3e6fe5a1-3b13-4d51-a141-1280e1b25b3a/ovn-acl-logging/0.log"
Mar 18 16:54:40.277455 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:40.277434 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ts4mw_3e6fe5a1-3b13-4d51-a141-1280e1b25b3a/ovn-acl-logging/0.log"
Mar 18 16:54:40.787657 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:40.787627 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-4699c" event={"ID":"3f985d57-b8fc-45bb-9d34-5007209116e4","Type":"ContainerStarted","Data":"8c1d12c9480f0bdcbccf46c5272feed705aac0b5102225d066523bee494db575"}
Mar 18 16:54:40.788813 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:40.788787 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-6wzcj" event={"ID":"aecf286a-11a3-4f4d-aa45-54aa5f55f3ce","Type":"ContainerStarted","Data":"68c4740bc169cba9ceaaf1f0ffd8aa8b640b1860ac692114ed3c9ac3628c233f"}
Mar 18 16:54:40.788930 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:40.788924 2536 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-6wzcj"
Mar 18 16:54:40.802730 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:40.802692 2536 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-4699c" podStartSLOduration=1.055493351 podStartE2EDuration="5.802679502s" podCreationTimestamp="2026-03-18 16:54:35 +0000 UTC" firstStartedPulling="2026-03-18 16:54:35.582216565 +0000 UTC m=+598.355791140" lastFinishedPulling="2026-03-18 16:54:40.329402726 +0000 UTC m=+603.102977291" observedRunningTime="2026-03-18 16:54:40.801016854 +0000 UTC m=+603.574591462" watchObservedRunningTime="2026-03-18 16:54:40.802679502 +0000 UTC m=+603.576254088"
Mar 18 16:54:40.817046 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:40.817001 2536 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-64c8f475fb-6wzcj" podStartSLOduration=1.518929885 podStartE2EDuration="6.816986585s" podCreationTimestamp="2026-03-18 16:54:34 +0000 UTC" firstStartedPulling="2026-03-18 16:54:34.955067723 +0000 UTC m=+597.728642293" lastFinishedPulling="2026-03-18 16:54:40.253124427 +0000 UTC m=+603.026698993" observedRunningTime="2026-03-18 16:54:40.81533953 +0000 UTC m=+603.588914111" watchObservedRunningTime="2026-03-18 16:54:40.816986585 +0000 UTC m=+603.590561172"
Mar 18 16:54:51.785081 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:51.785044 2536 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-6wzcj"]
Mar 18 16:54:51.785477 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:51.785296 2536 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-64c8f475fb-6wzcj" podUID="aecf286a-11a3-4f4d-aa45-54aa5f55f3ce" containerName="limitador" containerID="cri-o://68c4740bc169cba9ceaaf1f0ffd8aa8b640b1860ac692114ed3c9ac3628c233f" gracePeriod=30
Mar 18 16:54:51.787336 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:51.787310 2536 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-6wzcj"
Mar 18 16:54:52.719191 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:52.719169 2536 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-6wzcj"
Mar 18 16:54:52.782457 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:52.782430 2536 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/aecf286a-11a3-4f4d-aa45-54aa5f55f3ce-config-file\") pod \"aecf286a-11a3-4f4d-aa45-54aa5f55f3ce\" (UID: \"aecf286a-11a3-4f4d-aa45-54aa5f55f3ce\") "
Mar 18 16:54:52.782568 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:52.782472 2536 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8fss\" (UniqueName: \"kubernetes.io/projected/aecf286a-11a3-4f4d-aa45-54aa5f55f3ce-kube-api-access-s8fss\") pod \"aecf286a-11a3-4f4d-aa45-54aa5f55f3ce\" (UID: \"aecf286a-11a3-4f4d-aa45-54aa5f55f3ce\") "
Mar 18 16:54:52.782836 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:52.782808 2536 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aecf286a-11a3-4f4d-aa45-54aa5f55f3ce-config-file" (OuterVolumeSpecName: "config-file") pod "aecf286a-11a3-4f4d-aa45-54aa5f55f3ce" (UID: "aecf286a-11a3-4f4d-aa45-54aa5f55f3ce"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 18 16:54:52.784586 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:52.784563 2536 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aecf286a-11a3-4f4d-aa45-54aa5f55f3ce-kube-api-access-s8fss" (OuterVolumeSpecName: "kube-api-access-s8fss") pod "aecf286a-11a3-4f4d-aa45-54aa5f55f3ce" (UID: "aecf286a-11a3-4f4d-aa45-54aa5f55f3ce"). InnerVolumeSpecName "kube-api-access-s8fss". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 18 16:54:52.830876 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:52.830847 2536 generic.go:358] "Generic (PLEG): container finished" podID="aecf286a-11a3-4f4d-aa45-54aa5f55f3ce" containerID="68c4740bc169cba9ceaaf1f0ffd8aa8b640b1860ac692114ed3c9ac3628c233f" exitCode=0
Mar 18 16:54:52.831207 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:52.830907 2536 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-6wzcj"
Mar 18 16:54:52.831207 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:52.830916 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-6wzcj" event={"ID":"aecf286a-11a3-4f4d-aa45-54aa5f55f3ce","Type":"ContainerDied","Data":"68c4740bc169cba9ceaaf1f0ffd8aa8b640b1860ac692114ed3c9ac3628c233f"}
Mar 18 16:54:52.831207 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:52.830942 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-6wzcj" event={"ID":"aecf286a-11a3-4f4d-aa45-54aa5f55f3ce","Type":"ContainerDied","Data":"d638c7998de03ce76481b849fc6df7cde500d9822868a37a31d515965a22ada3"}
Mar 18 16:54:52.831207 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:52.830957 2536 scope.go:117] "RemoveContainer" containerID="68c4740bc169cba9ceaaf1f0ffd8aa8b640b1860ac692114ed3c9ac3628c233f"
Mar 18 16:54:52.839128 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:52.839113 2536 scope.go:117] "RemoveContainer" containerID="68c4740bc169cba9ceaaf1f0ffd8aa8b640b1860ac692114ed3c9ac3628c233f"
Mar 18 16:54:52.839379 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:54:52.839362 2536 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68c4740bc169cba9ceaaf1f0ffd8aa8b640b1860ac692114ed3c9ac3628c233f\": container with ID starting with 68c4740bc169cba9ceaaf1f0ffd8aa8b640b1860ac692114ed3c9ac3628c233f not found: ID does not exist" containerID="68c4740bc169cba9ceaaf1f0ffd8aa8b640b1860ac692114ed3c9ac3628c233f"
Mar 18 16:54:52.839454 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:52.839385 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68c4740bc169cba9ceaaf1f0ffd8aa8b640b1860ac692114ed3c9ac3628c233f"} err="failed to get container status \"68c4740bc169cba9ceaaf1f0ffd8aa8b640b1860ac692114ed3c9ac3628c233f\": rpc error: code = NotFound desc = could not find container \"68c4740bc169cba9ceaaf1f0ffd8aa8b640b1860ac692114ed3c9ac3628c233f\": container with ID starting with 68c4740bc169cba9ceaaf1f0ffd8aa8b640b1860ac692114ed3c9ac3628c233f not found: ID does not exist"
Mar 18 16:54:52.858413 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:52.858357 2536 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-6wzcj"]
Mar 18 16:54:52.862824 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:52.862804 2536 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-6wzcj"]
Mar 18 16:54:52.883898 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:52.883879 2536 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/aecf286a-11a3-4f4d-aa45-54aa5f55f3ce-config-file\") on node \"ip-10-0-132-224.ec2.internal\" DevicePath \"\""
Mar 18 16:54:52.883967 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:52.883900 2536 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s8fss\" (UniqueName: \"kubernetes.io/projected/aecf286a-11a3-4f4d-aa45-54aa5f55f3ce-kube-api-access-s8fss\") on node \"ip-10-0-132-224.ec2.internal\" DevicePath \"\""
Mar 18 16:54:53.909713 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:54:53.909677 2536 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aecf286a-11a3-4f4d-aa45-54aa5f55f3ce" path="/var/lib/kubelet/pods/aecf286a-11a3-4f4d-aa45-54aa5f55f3ce/volumes"
Mar 18 16:55:00.589946 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:00.589915 2536 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-68bd676465-jjpnw"]
Mar 18 16:55:00.590426 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:00.590234 2536 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aecf286a-11a3-4f4d-aa45-54aa5f55f3ce" containerName="limitador"
Mar 18 16:55:00.590426 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:00.590244 2536 state_mem.go:107] "Deleted CPUSet assignment" podUID="aecf286a-11a3-4f4d-aa45-54aa5f55f3ce" containerName="limitador"
Mar 18 16:55:00.590426 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:00.590314 2536 memory_manager.go:356] "RemoveStaleState removing state" podUID="aecf286a-11a3-4f4d-aa45-54aa5f55f3ce" containerName="limitador"
Mar 18 16:55:00.597567 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:00.597548 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-68bd676465-jjpnw"
Mar 18 16:55:00.599886 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:00.599840 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-jjpnw"]
Mar 18 16:55:00.600055 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:00.600035 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-mgks8\""
Mar 18 16:55:00.600149 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:00.600047 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\""
Mar 18 16:55:00.754223 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:00.754186 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/1b49104f-557f-4edc-a510-e922d1578fb9-tls-cert\") pod \"authorino-68bd676465-jjpnw\" (UID: \"1b49104f-557f-4edc-a510-e922d1578fb9\") " pod="kuadrant-system/authorino-68bd676465-jjpnw"
Mar 18 16:55:00.754410 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:00.754235 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9tps\" (UniqueName: \"kubernetes.io/projected/1b49104f-557f-4edc-a510-e922d1578fb9-kube-api-access-s9tps\") pod \"authorino-68bd676465-jjpnw\" (UID: \"1b49104f-557f-4edc-a510-e922d1578fb9\") " pod="kuadrant-system/authorino-68bd676465-jjpnw"
Mar 18 16:55:00.855573 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:00.855488 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/1b49104f-557f-4edc-a510-e922d1578fb9-tls-cert\") pod \"authorino-68bd676465-jjpnw\" (UID: \"1b49104f-557f-4edc-a510-e922d1578fb9\") " pod="kuadrant-system/authorino-68bd676465-jjpnw"
Mar 18 16:55:00.855573 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:00.855537 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s9tps\" (UniqueName: \"kubernetes.io/projected/1b49104f-557f-4edc-a510-e922d1578fb9-kube-api-access-s9tps\") pod \"authorino-68bd676465-jjpnw\" (UID: \"1b49104f-557f-4edc-a510-e922d1578fb9\") " pod="kuadrant-system/authorino-68bd676465-jjpnw"
Mar 18 16:55:00.858430 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:00.858400 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/1b49104f-557f-4edc-a510-e922d1578fb9-tls-cert\") pod \"authorino-68bd676465-jjpnw\" (UID: \"1b49104f-557f-4edc-a510-e922d1578fb9\") " pod="kuadrant-system/authorino-68bd676465-jjpnw"
Mar 18 16:55:00.862509 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:00.862485 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9tps\" (UniqueName: \"kubernetes.io/projected/1b49104f-557f-4edc-a510-e922d1578fb9-kube-api-access-s9tps\") pod \"authorino-68bd676465-jjpnw\" (UID: \"1b49104f-557f-4edc-a510-e922d1578fb9\") " pod="kuadrant-system/authorino-68bd676465-jjpnw"
Mar 18 16:55:00.908211 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:00.908189 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-68bd676465-jjpnw"
Mar 18 16:55:01.027170 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:01.027142 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-jjpnw"]
Mar 18 16:55:01.029740 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:55:01.029709 2536 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b49104f_557f_4edc_a510_e922d1578fb9.slice/crio-e236add096ec387ddc5efaf562ce9ef2bd5f935f0fabdf0779d49878946eaa61 WatchSource:0}: Error finding container e236add096ec387ddc5efaf562ce9ef2bd5f935f0fabdf0779d49878946eaa61: Status 404 returned error can't find the container with id e236add096ec387ddc5efaf562ce9ef2bd5f935f0fabdf0779d49878946eaa61
Mar 18 16:55:01.864162 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:01.864111 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-jjpnw" event={"ID":"1b49104f-557f-4edc-a510-e922d1578fb9","Type":"ContainerStarted","Data":"e236add096ec387ddc5efaf562ce9ef2bd5f935f0fabdf0779d49878946eaa61"}
Mar 18 16:55:03.872454 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:03.872419 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-jjpnw" event={"ID":"1b49104f-557f-4edc-a510-e922d1578fb9","Type":"ContainerStarted","Data":"5ccd8b14df2d38ed973b625acb990ff1511c38108b60a0162843b532005b5f38"}
Mar 18 16:55:03.888759 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:03.888714 2536 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-68bd676465-jjpnw" podStartSLOduration=1.556085741 podStartE2EDuration="3.888700601s" podCreationTimestamp="2026-03-18 16:55:00 +0000 UTC" firstStartedPulling="2026-03-18 16:55:01.031090056 +0000 UTC m=+623.804664623" lastFinishedPulling="2026-03-18 16:55:03.363704914 +0000 UTC m=+626.137279483" observedRunningTime="2026-03-18 16:55:03.887227457 +0000 UTC m=+626.660802043" watchObservedRunningTime="2026-03-18 16:55:03.888700601 +0000 UTC m=+626.662275187"
Mar 18 16:55:18.775782 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:18.775750 2536 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-69d7c9bbdc-dq2dc"]
Mar 18 16:55:18.798791 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:18.798759 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-69d7c9bbdc-dq2dc"]
Mar 18 16:55:18.798930 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:18.798848 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-69d7c9bbdc-dq2dc"
Mar 18 16:55:18.800905 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:18.800876 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Mar 18 16:55:18.801403 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:18.801379 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Mar 18 16:55:18.801403 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:18.801381 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-wng6s\""
Mar 18 16:55:18.801660 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:18.801639 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\""
Mar 18 16:55:18.818129 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:18.818109 2536 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-748c497bc-8622b"]
Mar 18 16:55:18.821702 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:18.821686 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-748c497bc-8622b"
Mar 18 16:55:18.823461 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:18.823443 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Mar 18 16:55:18.823550 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:18.823522 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-hzfrv\""
Mar 18 16:55:18.823893 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:18.823876 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/67d1cd5e-d6a7-4e9e-b4af-8a8ad1516690-cert\") pod \"kserve-controller-manager-69d7c9bbdc-dq2dc\" (UID: \"67d1cd5e-d6a7-4e9e-b4af-8a8ad1516690\") " pod="kserve/kserve-controller-manager-69d7c9bbdc-dq2dc"
Mar 18 16:55:18.823969 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:18.823945 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nf4n\" (UniqueName: \"kubernetes.io/projected/67d1cd5e-d6a7-4e9e-b4af-8a8ad1516690-kube-api-access-2nf4n\") pod \"kserve-controller-manager-69d7c9bbdc-dq2dc\" (UID: \"67d1cd5e-d6a7-4e9e-b4af-8a8ad1516690\") " pod="kserve/kserve-controller-manager-69d7c9bbdc-dq2dc"
Mar 18 16:55:18.828988 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:18.828968 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-748c497bc-8622b"]
Mar 18 16:55:18.924436 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:18.924408 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/facc2261-0bab-4970-a32e-5e7528858ab1-data\") pod \"seaweedfs-748c497bc-8622b\" (UID: \"facc2261-0bab-4970-a32e-5e7528858ab1\") " pod="kserve/seaweedfs-748c497bc-8622b"
Mar 18 16:55:18.924590 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:18.924443 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2nf4n\" (UniqueName: \"kubernetes.io/projected/67d1cd5e-d6a7-4e9e-b4af-8a8ad1516690-kube-api-access-2nf4n\") pod \"kserve-controller-manager-69d7c9bbdc-dq2dc\" (UID: \"67d1cd5e-d6a7-4e9e-b4af-8a8ad1516690\") " pod="kserve/kserve-controller-manager-69d7c9bbdc-dq2dc"
Mar 18 16:55:18.924590 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:18.924486 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wthlx\" (UniqueName: \"kubernetes.io/projected/facc2261-0bab-4970-a32e-5e7528858ab1-kube-api-access-wthlx\") pod \"seaweedfs-748c497bc-8622b\" (UID: \"facc2261-0bab-4970-a32e-5e7528858ab1\") " pod="kserve/seaweedfs-748c497bc-8622b"
Mar 18 16:55:18.924590 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:18.924536 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/67d1cd5e-d6a7-4e9e-b4af-8a8ad1516690-cert\") pod \"kserve-controller-manager-69d7c9bbdc-dq2dc\" (UID: \"67d1cd5e-d6a7-4e9e-b4af-8a8ad1516690\") " pod="kserve/kserve-controller-manager-69d7c9bbdc-dq2dc"
Mar 18 16:55:18.924733 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:55:18.924622 2536 secret.go:189] Couldn't get secret kserve/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found
Mar 18 16:55:18.924733 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:55:18.924676 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67d1cd5e-d6a7-4e9e-b4af-8a8ad1516690-cert podName:67d1cd5e-d6a7-4e9e-b4af-8a8ad1516690 nodeName:}" failed. No retries permitted until 2026-03-18 16:55:19.424660332 +0000 UTC m=+642.198234896 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/67d1cd5e-d6a7-4e9e-b4af-8a8ad1516690-cert") pod "kserve-controller-manager-69d7c9bbdc-dq2dc" (UID: "67d1cd5e-d6a7-4e9e-b4af-8a8ad1516690") : secret "kserve-webhook-server-cert" not found
Mar 18 16:55:18.939293 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:18.939250 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nf4n\" (UniqueName: \"kubernetes.io/projected/67d1cd5e-d6a7-4e9e-b4af-8a8ad1516690-kube-api-access-2nf4n\") pod \"kserve-controller-manager-69d7c9bbdc-dq2dc\" (UID: \"67d1cd5e-d6a7-4e9e-b4af-8a8ad1516690\") " pod="kserve/kserve-controller-manager-69d7c9bbdc-dq2dc"
Mar 18 16:55:19.025409 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:19.025368 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wthlx\" (UniqueName: \"kubernetes.io/projected/facc2261-0bab-4970-a32e-5e7528858ab1-kube-api-access-wthlx\") pod \"seaweedfs-748c497bc-8622b\" (UID: \"facc2261-0bab-4970-a32e-5e7528858ab1\") " pod="kserve/seaweedfs-748c497bc-8622b"
Mar 18 16:55:19.025564 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:19.025473 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/facc2261-0bab-4970-a32e-5e7528858ab1-data\") pod \"seaweedfs-748c497bc-8622b\" (UID: \"facc2261-0bab-4970-a32e-5e7528858ab1\") " pod="kserve/seaweedfs-748c497bc-8622b"
Mar 18 16:55:19.025822 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:19.025776 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/facc2261-0bab-4970-a32e-5e7528858ab1-data\") pod \"seaweedfs-748c497bc-8622b\" (UID: \"facc2261-0bab-4970-a32e-5e7528858ab1\") " pod="kserve/seaweedfs-748c497bc-8622b"
Mar 18 16:55:19.035559 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:19.035532 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wthlx\" (UniqueName: \"kubernetes.io/projected/facc2261-0bab-4970-a32e-5e7528858ab1-kube-api-access-wthlx\") pod \"seaweedfs-748c497bc-8622b\" (UID: \"facc2261-0bab-4970-a32e-5e7528858ab1\") " pod="kserve/seaweedfs-748c497bc-8622b"
Mar 18 16:55:19.130925 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:19.130901 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-748c497bc-8622b"
Mar 18 16:55:19.428546 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:19.428465 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/67d1cd5e-d6a7-4e9e-b4af-8a8ad1516690-cert\") pod \"kserve-controller-manager-69d7c9bbdc-dq2dc\" (UID: \"67d1cd5e-d6a7-4e9e-b4af-8a8ad1516690\") " pod="kserve/kserve-controller-manager-69d7c9bbdc-dq2dc"
Mar 18 16:55:19.431022 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:19.430982 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/67d1cd5e-d6a7-4e9e-b4af-8a8ad1516690-cert\") pod \"kserve-controller-manager-69d7c9bbdc-dq2dc\" (UID: \"67d1cd5e-d6a7-4e9e-b4af-8a8ad1516690\") " pod="kserve/kserve-controller-manager-69d7c9bbdc-dq2dc"
Mar 18 16:55:19.456224 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:19.456203 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-748c497bc-8622b"]
Mar 18 16:55:19.458386 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:55:19.458360 2536 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfacc2261_0bab_4970_a32e_5e7528858ab1.slice/crio-096976db6d1090c1e1a9ebc095c079faea016751a57b666abe7ee325d0dc2838 WatchSource:0}: Error finding container 096976db6d1090c1e1a9ebc095c079faea016751a57b666abe7ee325d0dc2838: Status 404 returned error can't find the container with id 096976db6d1090c1e1a9ebc095c079faea016751a57b666abe7ee325d0dc2838
Mar 18 16:55:19.709536 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:19.709513 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-69d7c9bbdc-dq2dc"
Mar 18 16:55:19.838643 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:19.838609 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-69d7c9bbdc-dq2dc"]
Mar 18 16:55:19.928850 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:19.928813 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-748c497bc-8622b" event={"ID":"facc2261-0bab-4970-a32e-5e7528858ab1","Type":"ContainerStarted","Data":"096976db6d1090c1e1a9ebc095c079faea016751a57b666abe7ee325d0dc2838"}
Mar 18 16:55:19.931998 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:55:19.931966 2536 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67d1cd5e_d6a7_4e9e_b4af_8a8ad1516690.slice/crio-e1a2567e8e958a9b77baf2f71b2e7e4409555b434eda0cc74fb96e0f9e30a6f0 WatchSource:0}: Error finding container e1a2567e8e958a9b77baf2f71b2e7e4409555b434eda0cc74fb96e0f9e30a6f0: Status 404 returned error can't find the container with id e1a2567e8e958a9b77baf2f71b2e7e4409555b434eda0cc74fb96e0f9e30a6f0
Mar 18 16:55:20.057671 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:20.057587 2536 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-69d7c9bbdc-dq2dc"]
Mar 18 16:55:20.074819 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:20.074787 2536 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-69d7c9bbdc-vsjmt"]
Mar 18 16:55:20.078389 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:20.078362 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-69d7c9bbdc-vsjmt"
Mar 18 16:55:20.086667 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:20.086644 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-69d7c9bbdc-vsjmt"]
Mar 18 16:55:20.135193 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:20.135160 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a0900f5-c051-4ef0-b3db-f7e1bf05aad5-cert\") pod \"kserve-controller-manager-69d7c9bbdc-vsjmt\" (UID: \"8a0900f5-c051-4ef0-b3db-f7e1bf05aad5\") " pod="kserve/kserve-controller-manager-69d7c9bbdc-vsjmt"
Mar 18 16:55:20.135376 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:20.135327 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-964tr\" (UniqueName: \"kubernetes.io/projected/8a0900f5-c051-4ef0-b3db-f7e1bf05aad5-kube-api-access-964tr\") pod \"kserve-controller-manager-69d7c9bbdc-vsjmt\" (UID: \"8a0900f5-c051-4ef0-b3db-f7e1bf05aad5\") " pod="kserve/kserve-controller-manager-69d7c9bbdc-vsjmt"
Mar 18 16:55:20.236608 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:20.236573 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a0900f5-c051-4ef0-b3db-f7e1bf05aad5-cert\") pod \"kserve-controller-manager-69d7c9bbdc-vsjmt\" (UID: \"8a0900f5-c051-4ef0-b3db-f7e1bf05aad5\") " pod="kserve/kserve-controller-manager-69d7c9bbdc-vsjmt"
Mar 18 16:55:20.236762 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:20.236693 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-964tr\" (UniqueName: \"kubernetes.io/projected/8a0900f5-c051-4ef0-b3db-f7e1bf05aad5-kube-api-access-964tr\") pod \"kserve-controller-manager-69d7c9bbdc-vsjmt\" (UID: \"8a0900f5-c051-4ef0-b3db-f7e1bf05aad5\") " 
pod="kserve/kserve-controller-manager-69d7c9bbdc-vsjmt" Mar 18 16:55:20.239393 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:20.239366 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a0900f5-c051-4ef0-b3db-f7e1bf05aad5-cert\") pod \"kserve-controller-manager-69d7c9bbdc-vsjmt\" (UID: \"8a0900f5-c051-4ef0-b3db-f7e1bf05aad5\") " pod="kserve/kserve-controller-manager-69d7c9bbdc-vsjmt" Mar 18 16:55:20.245160 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:20.245132 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-964tr\" (UniqueName: \"kubernetes.io/projected/8a0900f5-c051-4ef0-b3db-f7e1bf05aad5-kube-api-access-964tr\") pod \"kserve-controller-manager-69d7c9bbdc-vsjmt\" (UID: \"8a0900f5-c051-4ef0-b3db-f7e1bf05aad5\") " pod="kserve/kserve-controller-manager-69d7c9bbdc-vsjmt" Mar 18 16:55:20.392866 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:20.392793 2536 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-69d7c9bbdc-vsjmt" Mar 18 16:55:20.536964 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:20.536926 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-69d7c9bbdc-vsjmt"] Mar 18 16:55:20.540013 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:55:20.539981 2536 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a0900f5_c051_4ef0_b3db_f7e1bf05aad5.slice/crio-81bd3d3dd576ad20eb8f3d9cf414f23df921a0f6f6f8099977fdb4554df26d43 WatchSource:0}: Error finding container 81bd3d3dd576ad20eb8f3d9cf414f23df921a0f6f6f8099977fdb4554df26d43: Status 404 returned error can't find the container with id 81bd3d3dd576ad20eb8f3d9cf414f23df921a0f6f6f8099977fdb4554df26d43 Mar 18 16:55:20.933999 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:20.933942 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-69d7c9bbdc-vsjmt" event={"ID":"8a0900f5-c051-4ef0-b3db-f7e1bf05aad5","Type":"ContainerStarted","Data":"81bd3d3dd576ad20eb8f3d9cf414f23df921a0f6f6f8099977fdb4554df26d43"} Mar 18 16:55:20.935217 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:20.935192 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-69d7c9bbdc-dq2dc" event={"ID":"67d1cd5e-d6a7-4e9e-b4af-8a8ad1516690","Type":"ContainerStarted","Data":"e1a2567e8e958a9b77baf2f71b2e7e4409555b434eda0cc74fb96e0f9e30a6f0"} Mar 18 16:55:23.946742 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:23.946703 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-69d7c9bbdc-dq2dc" event={"ID":"67d1cd5e-d6a7-4e9e-b4af-8a8ad1516690","Type":"ContainerStarted","Data":"ca40cfe43b0995f9ca35c160c54171a997d119f6894a350fcf934904e2e87364"} Mar 18 16:55:23.946742 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:23.946754 2536 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-69d7c9bbdc-dq2dc" Mar 18 16:55:23.947240 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:23.946752 2536 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-69d7c9bbdc-dq2dc" podUID="67d1cd5e-d6a7-4e9e-b4af-8a8ad1516690" containerName="manager" containerID="cri-o://ca40cfe43b0995f9ca35c160c54171a997d119f6894a350fcf934904e2e87364" gracePeriod=10 Mar 18 16:55:23.948067 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:23.948036 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-748c497bc-8622b" event={"ID":"facc2261-0bab-4970-a32e-5e7528858ab1","Type":"ContainerStarted","Data":"199edf81224872ecfada35dadac295dad07f251aed18d44e5b4e80c8075211dc"} Mar 18 16:55:23.949390 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:23.949368 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-69d7c9bbdc-vsjmt" event={"ID":"8a0900f5-c051-4ef0-b3db-f7e1bf05aad5","Type":"ContainerStarted","Data":"30765b6709b92bc41377f900b67cf3885b2ec391bd5e6ae51644155695b1dcba"} Mar 18 16:55:23.949518 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:23.949499 2536 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-69d7c9bbdc-vsjmt" Mar 18 16:55:23.963667 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:23.963628 2536 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-69d7c9bbdc-dq2dc" podStartSLOduration=2.211290941 podStartE2EDuration="5.963618425s" podCreationTimestamp="2026-03-18 16:55:18 +0000 UTC" firstStartedPulling="2026-03-18 16:55:19.934128429 +0000 UTC m=+642.707703006" lastFinishedPulling="2026-03-18 16:55:23.686455917 +0000 UTC m=+646.460030490" observedRunningTime="2026-03-18 16:55:23.961953293 +0000 UTC m=+646.735527890" watchObservedRunningTime="2026-03-18 16:55:23.963618425 +0000 
UTC m=+646.737193048" Mar 18 16:55:23.977700 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:23.977661 2536 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-69d7c9bbdc-vsjmt" podStartSLOduration=0.874240398 podStartE2EDuration="3.977651961s" podCreationTimestamp="2026-03-18 16:55:20 +0000 UTC" firstStartedPulling="2026-03-18 16:55:20.542455999 +0000 UTC m=+643.316030581" lastFinishedPulling="2026-03-18 16:55:23.64586758 +0000 UTC m=+646.419442144" observedRunningTime="2026-03-18 16:55:23.976784115 +0000 UTC m=+646.750358697" watchObservedRunningTime="2026-03-18 16:55:23.977651961 +0000 UTC m=+646.751226546" Mar 18 16:55:23.993626 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:23.993587 2536 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-748c497bc-8622b" podStartSLOduration=1.7109019669999999 podStartE2EDuration="5.993573803s" podCreationTimestamp="2026-03-18 16:55:18 +0000 UTC" firstStartedPulling="2026-03-18 16:55:19.459575064 +0000 UTC m=+642.233149631" lastFinishedPulling="2026-03-18 16:55:23.742246904 +0000 UTC m=+646.515821467" observedRunningTime="2026-03-18 16:55:23.99213106 +0000 UTC m=+646.765705660" watchObservedRunningTime="2026-03-18 16:55:23.993573803 +0000 UTC m=+646.767148392" Mar 18 16:55:24.188123 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:24.188102 2536 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-69d7c9bbdc-dq2dc" Mar 18 16:55:24.277098 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:24.277063 2536 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/67d1cd5e-d6a7-4e9e-b4af-8a8ad1516690-cert\") pod \"67d1cd5e-d6a7-4e9e-b4af-8a8ad1516690\" (UID: \"67d1cd5e-d6a7-4e9e-b4af-8a8ad1516690\") " Mar 18 16:55:24.277248 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:24.277140 2536 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nf4n\" (UniqueName: \"kubernetes.io/projected/67d1cd5e-d6a7-4e9e-b4af-8a8ad1516690-kube-api-access-2nf4n\") pod \"67d1cd5e-d6a7-4e9e-b4af-8a8ad1516690\" (UID: \"67d1cd5e-d6a7-4e9e-b4af-8a8ad1516690\") " Mar 18 16:55:24.279222 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:24.279196 2536 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67d1cd5e-d6a7-4e9e-b4af-8a8ad1516690-cert" (OuterVolumeSpecName: "cert") pod "67d1cd5e-d6a7-4e9e-b4af-8a8ad1516690" (UID: "67d1cd5e-d6a7-4e9e-b4af-8a8ad1516690"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:55:24.279333 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:24.279229 2536 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67d1cd5e-d6a7-4e9e-b4af-8a8ad1516690-kube-api-access-2nf4n" (OuterVolumeSpecName: "kube-api-access-2nf4n") pod "67d1cd5e-d6a7-4e9e-b4af-8a8ad1516690" (UID: "67d1cd5e-d6a7-4e9e-b4af-8a8ad1516690"). InnerVolumeSpecName "kube-api-access-2nf4n". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:55:24.378693 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:24.378626 2536 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/67d1cd5e-d6a7-4e9e-b4af-8a8ad1516690-cert\") on node \"ip-10-0-132-224.ec2.internal\" DevicePath \"\"" Mar 18 16:55:24.378693 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:24.378653 2536 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2nf4n\" (UniqueName: \"kubernetes.io/projected/67d1cd5e-d6a7-4e9e-b4af-8a8ad1516690-kube-api-access-2nf4n\") on node \"ip-10-0-132-224.ec2.internal\" DevicePath \"\"" Mar 18 16:55:24.953876 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:24.953844 2536 generic.go:358] "Generic (PLEG): container finished" podID="67d1cd5e-d6a7-4e9e-b4af-8a8ad1516690" containerID="ca40cfe43b0995f9ca35c160c54171a997d119f6894a350fcf934904e2e87364" exitCode=2 Mar 18 16:55:24.954330 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:24.953911 2536 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-69d7c9bbdc-dq2dc" Mar 18 16:55:24.954330 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:24.953934 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-69d7c9bbdc-dq2dc" event={"ID":"67d1cd5e-d6a7-4e9e-b4af-8a8ad1516690","Type":"ContainerDied","Data":"ca40cfe43b0995f9ca35c160c54171a997d119f6894a350fcf934904e2e87364"} Mar 18 16:55:24.954330 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:24.953976 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-69d7c9bbdc-dq2dc" event={"ID":"67d1cd5e-d6a7-4e9e-b4af-8a8ad1516690","Type":"ContainerDied","Data":"e1a2567e8e958a9b77baf2f71b2e7e4409555b434eda0cc74fb96e0f9e30a6f0"} Mar 18 16:55:24.954330 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:24.953993 2536 scope.go:117] "RemoveContainer" containerID="ca40cfe43b0995f9ca35c160c54171a997d119f6894a350fcf934904e2e87364" Mar 18 16:55:24.962683 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:24.962666 2536 scope.go:117] "RemoveContainer" containerID="ca40cfe43b0995f9ca35c160c54171a997d119f6894a350fcf934904e2e87364" Mar 18 16:55:24.962934 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:55:24.962903 2536 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca40cfe43b0995f9ca35c160c54171a997d119f6894a350fcf934904e2e87364\": container with ID starting with ca40cfe43b0995f9ca35c160c54171a997d119f6894a350fcf934904e2e87364 not found: ID does not exist" containerID="ca40cfe43b0995f9ca35c160c54171a997d119f6894a350fcf934904e2e87364" Mar 18 16:55:24.962997 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:24.962946 2536 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca40cfe43b0995f9ca35c160c54171a997d119f6894a350fcf934904e2e87364"} err="failed to get container status 
\"ca40cfe43b0995f9ca35c160c54171a997d119f6894a350fcf934904e2e87364\": rpc error: code = NotFound desc = could not find container \"ca40cfe43b0995f9ca35c160c54171a997d119f6894a350fcf934904e2e87364\": container with ID starting with ca40cfe43b0995f9ca35c160c54171a997d119f6894a350fcf934904e2e87364 not found: ID does not exist" Mar 18 16:55:24.972901 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:24.972879 2536 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-69d7c9bbdc-dq2dc"] Mar 18 16:55:24.976773 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:24.976753 2536 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-69d7c9bbdc-dq2dc"] Mar 18 16:55:25.910632 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:25.910595 2536 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67d1cd5e-d6a7-4e9e-b4af-8a8ad1516690" path="/var/lib/kubelet/pods/67d1cd5e-d6a7-4e9e-b4af-8a8ad1516690/volumes" Mar 18 16:55:54.959912 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:54.959883 2536 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-69d7c9bbdc-vsjmt" Mar 18 16:55:55.866516 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:55.866483 2536 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-9699c8d45-g4wpv"] Mar 18 16:55:55.866873 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:55.866860 2536 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="67d1cd5e-d6a7-4e9e-b4af-8a8ad1516690" containerName="manager" Mar 18 16:55:55.866925 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:55.866876 2536 state_mem.go:107] "Deleted CPUSet assignment" podUID="67d1cd5e-d6a7-4e9e-b4af-8a8ad1516690" containerName="manager" Mar 18 16:55:55.866963 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:55.866929 2536 memory_manager.go:356] "RemoveStaleState removing state" podUID="67d1cd5e-d6a7-4e9e-b4af-8a8ad1516690" 
containerName="manager" Mar 18 16:55:55.870284 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:55.870251 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-9699c8d45-g4wpv" Mar 18 16:55:55.872321 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:55.872301 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Mar 18 16:55:55.872321 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:55.872315 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-8k9jw\"" Mar 18 16:55:55.878763 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:55.878733 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-9699c8d45-g4wpv"] Mar 18 16:55:55.882054 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:55.882030 2536 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-7ldxj"] Mar 18 16:55:55.886566 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:55.886543 2536 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-7ldxj" Mar 18 16:55:55.889836 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:55.889479 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-ccdv9\"" Mar 18 16:55:55.889836 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:55.889560 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Mar 18 16:55:55.894798 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:55.894762 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-7ldxj"] Mar 18 16:55:55.944484 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:55.944449 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-865qs\" (UniqueName: \"kubernetes.io/projected/890518d1-0467-414b-93e1-25b9f72d6aca-kube-api-access-865qs\") pod \"model-serving-api-9699c8d45-g4wpv\" (UID: \"890518d1-0467-414b-93e1-25b9f72d6aca\") " pod="kserve/model-serving-api-9699c8d45-g4wpv" Mar 18 16:55:55.944617 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:55.944487 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mq6d\" (UniqueName: \"kubernetes.io/projected/d66d4f5b-9ca9-46b1-bfd5-2e7ca71f251c-kube-api-access-8mq6d\") pod \"odh-model-controller-696fc77849-7ldxj\" (UID: \"d66d4f5b-9ca9-46b1-bfd5-2e7ca71f251c\") " pod="kserve/odh-model-controller-696fc77849-7ldxj" Mar 18 16:55:55.944617 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:55.944556 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/890518d1-0467-414b-93e1-25b9f72d6aca-tls-certs\") pod \"model-serving-api-9699c8d45-g4wpv\" (UID: \"890518d1-0467-414b-93e1-25b9f72d6aca\") " 
pod="kserve/model-serving-api-9699c8d45-g4wpv" Mar 18 16:55:55.944695 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:55.944628 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d66d4f5b-9ca9-46b1-bfd5-2e7ca71f251c-cert\") pod \"odh-model-controller-696fc77849-7ldxj\" (UID: \"d66d4f5b-9ca9-46b1-bfd5-2e7ca71f251c\") " pod="kserve/odh-model-controller-696fc77849-7ldxj" Mar 18 16:55:56.045499 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:56.045449 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d66d4f5b-9ca9-46b1-bfd5-2e7ca71f251c-cert\") pod \"odh-model-controller-696fc77849-7ldxj\" (UID: \"d66d4f5b-9ca9-46b1-bfd5-2e7ca71f251c\") " pod="kserve/odh-model-controller-696fc77849-7ldxj" Mar 18 16:55:56.045499 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:56.045511 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-865qs\" (UniqueName: \"kubernetes.io/projected/890518d1-0467-414b-93e1-25b9f72d6aca-kube-api-access-865qs\") pod \"model-serving-api-9699c8d45-g4wpv\" (UID: \"890518d1-0467-414b-93e1-25b9f72d6aca\") " pod="kserve/model-serving-api-9699c8d45-g4wpv" Mar 18 16:55:56.046031 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:56.045545 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8mq6d\" (UniqueName: \"kubernetes.io/projected/d66d4f5b-9ca9-46b1-bfd5-2e7ca71f251c-kube-api-access-8mq6d\") pod \"odh-model-controller-696fc77849-7ldxj\" (UID: \"d66d4f5b-9ca9-46b1-bfd5-2e7ca71f251c\") " pod="kserve/odh-model-controller-696fc77849-7ldxj" Mar 18 16:55:56.046031 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:56.045582 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/890518d1-0467-414b-93e1-25b9f72d6aca-tls-certs\") pod 
\"model-serving-api-9699c8d45-g4wpv\" (UID: \"890518d1-0467-414b-93e1-25b9f72d6aca\") " pod="kserve/model-serving-api-9699c8d45-g4wpv" Mar 18 16:55:56.046031 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:55:56.045772 2536 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found Mar 18 16:55:56.046031 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:55:56.045846 2536 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/890518d1-0467-414b-93e1-25b9f72d6aca-tls-certs podName:890518d1-0467-414b-93e1-25b9f72d6aca nodeName:}" failed. No retries permitted until 2026-03-18 16:55:56.545823225 +0000 UTC m=+679.319397794 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/890518d1-0467-414b-93e1-25b9f72d6aca-tls-certs") pod "model-serving-api-9699c8d45-g4wpv" (UID: "890518d1-0467-414b-93e1-25b9f72d6aca") : secret "model-serving-api-tls" not found Mar 18 16:55:56.048129 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:56.048107 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d66d4f5b-9ca9-46b1-bfd5-2e7ca71f251c-cert\") pod \"odh-model-controller-696fc77849-7ldxj\" (UID: \"d66d4f5b-9ca9-46b1-bfd5-2e7ca71f251c\") " pod="kserve/odh-model-controller-696fc77849-7ldxj" Mar 18 16:55:56.059054 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:56.059026 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mq6d\" (UniqueName: \"kubernetes.io/projected/d66d4f5b-9ca9-46b1-bfd5-2e7ca71f251c-kube-api-access-8mq6d\") pod \"odh-model-controller-696fc77849-7ldxj\" (UID: \"d66d4f5b-9ca9-46b1-bfd5-2e7ca71f251c\") " pod="kserve/odh-model-controller-696fc77849-7ldxj" Mar 18 16:55:56.059313 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:56.059294 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-865qs\" 
(UniqueName: \"kubernetes.io/projected/890518d1-0467-414b-93e1-25b9f72d6aca-kube-api-access-865qs\") pod \"model-serving-api-9699c8d45-g4wpv\" (UID: \"890518d1-0467-414b-93e1-25b9f72d6aca\") " pod="kserve/model-serving-api-9699c8d45-g4wpv" Mar 18 16:55:56.199979 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:56.199882 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-7ldxj" Mar 18 16:55:56.319673 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:56.319599 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-7ldxj"] Mar 18 16:55:56.322016 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:55:56.321985 2536 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd66d4f5b_9ca9_46b1_bfd5_2e7ca71f251c.slice/crio-26fddcc9b99329aa764b40a64d064db3791df52cf85a77aa425be365ea6aedc0 WatchSource:0}: Error finding container 26fddcc9b99329aa764b40a64d064db3791df52cf85a77aa425be365ea6aedc0: Status 404 returned error can't find the container with id 26fddcc9b99329aa764b40a64d064db3791df52cf85a77aa425be365ea6aedc0 Mar 18 16:55:56.551438 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:56.551400 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/890518d1-0467-414b-93e1-25b9f72d6aca-tls-certs\") pod \"model-serving-api-9699c8d45-g4wpv\" (UID: \"890518d1-0467-414b-93e1-25b9f72d6aca\") " pod="kserve/model-serving-api-9699c8d45-g4wpv" Mar 18 16:55:56.553850 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:56.553832 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/890518d1-0467-414b-93e1-25b9f72d6aca-tls-certs\") pod \"model-serving-api-9699c8d45-g4wpv\" (UID: \"890518d1-0467-414b-93e1-25b9f72d6aca\") " pod="kserve/model-serving-api-9699c8d45-g4wpv" Mar 18 
16:55:56.782237 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:56.782193 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-9699c8d45-g4wpv" Mar 18 16:55:56.920855 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:56.920820 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-9699c8d45-g4wpv"] Mar 18 16:55:56.922983 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:55:56.922955 2536 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod890518d1_0467_414b_93e1_25b9f72d6aca.slice/crio-a0eaea8ec6bf3160d63390056947ef866c2407a99208a42197e63794697f8c67 WatchSource:0}: Error finding container a0eaea8ec6bf3160d63390056947ef866c2407a99208a42197e63794697f8c67: Status 404 returned error can't find the container with id a0eaea8ec6bf3160d63390056947ef866c2407a99208a42197e63794697f8c67 Mar 18 16:55:57.062283 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:57.062226 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-9699c8d45-g4wpv" event={"ID":"890518d1-0467-414b-93e1-25b9f72d6aca","Type":"ContainerStarted","Data":"a0eaea8ec6bf3160d63390056947ef866c2407a99208a42197e63794697f8c67"} Mar 18 16:55:57.064185 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:57.064139 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-7ldxj" event={"ID":"d66d4f5b-9ca9-46b1-bfd5-2e7ca71f251c","Type":"ContainerStarted","Data":"26fddcc9b99329aa764b40a64d064db3791df52cf85a77aa425be365ea6aedc0"} Mar 18 16:55:57.193955 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:55:57.193830 2536 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: 
unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" image="quay.io/opendatahub/odh-model-serving-api:fast" Mar 18 16:55:57.194213 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:55:57.194139 2536 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:server,Image:quay.io/opendatahub/odh-model-serving-api:fast,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},ContainerPort{Name:metrics,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:TLS_CERT_DIR,Value:/tls,ValueFrom:nil,},EnvVar{Name:GATEWAY_LABEL_SELECTOR,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tls-certs,ReadOnly:true,MountPath:/tls,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-865qs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000680000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod model-serving-api-9699c8d45-g4wpv_kserve(890518d1-0467-414b-93e1-25b9f72d6aca): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" logger="UnhandledError" Mar 18 16:55:57.195391 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:55:57.195340 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the 
requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-g4wpv" podUID="890518d1-0467-414b-93e1-25b9f72d6aca" Mar 18 16:55:58.068591 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:55:58.068551 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-g4wpv" podUID="890518d1-0467-414b-93e1-25b9f72d6aca" Mar 18 16:55:59.072469 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:59.072433 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-7ldxj" event={"ID":"d66d4f5b-9ca9-46b1-bfd5-2e7ca71f251c","Type":"ContainerStarted","Data":"309843e6bfcbc2ac568aa55d126897649705de27e7849cc7c042e038ad298e4b"} Mar 18 16:55:59.072833 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:59.072530 2536 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-7ldxj" Mar 18 16:55:59.087927 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:55:59.087878 2536 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-7ldxj" podStartSLOduration=1.552503132 podStartE2EDuration="4.087867495s" podCreationTimestamp="2026-03-18 16:55:55 +0000 UTC" firstStartedPulling="2026-03-18 16:55:56.323320962 +0000 UTC m=+679.096895529" lastFinishedPulling="2026-03-18 16:55:58.858685318 +0000 UTC 
m=+681.632259892" observedRunningTime="2026-03-18 16:55:59.08737895 +0000 UTC m=+681.860953533" watchObservedRunningTime="2026-03-18 16:55:59.087867495 +0000 UTC m=+681.861442080" Mar 18 16:56:10.077811 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:56:10.077784 2536 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-7ldxj" Mar 18 16:56:11.193187 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:56:11.193110 2536 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" image="quay.io/opendatahub/odh-model-serving-api:fast" Mar 18 16:56:11.193567 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:56:11.193305 2536 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:server,Image:quay.io/opendatahub/odh-model-serving-api:fast,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},ContainerPort{Name:metrics,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:TLS_CERT_DIR,Value:/tls,ValueFrom:nil,},EnvVar{Name:GATEWAY_LABEL_SELECTOR,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tls-certs,ReadOnly:true,MountPath:/tls,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-865qs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000680000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod model-serving-api-9699c8d45-g4wpv_kserve(890518d1-0467-414b-93e1-25b9f72d6aca): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image 
source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" logger="UnhandledError" Mar 18 16:56:11.194502 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:56:11.194472 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-g4wpv" podUID="890518d1-0467-414b-93e1-25b9f72d6aca" Mar 18 16:56:23.907139 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:56:23.907108 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-g4wpv" podUID="890518d1-0467-414b-93e1-25b9f72d6aca" Mar 18 16:56:28.334438 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:56:28.334396 2536 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-8zg6n"] Mar 18 16:56:28.338676 
ip-10-0-132-224 kubenswrapper[2536]: I0318 16:56:28.338651 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-8zg6n" Mar 18 16:56:28.341051 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:56:28.341026 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Mar 18 16:56:28.341186 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:56:28.341073 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Mar 18 16:56:28.341186 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:56:28.341026 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"istio-ca-root-cert\"" Mar 18 16:56:28.341356 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:56:28.341337 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-1-openshift-default-dockercfg-hf4v6\"" Mar 18 16:56:28.348986 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:56:28.348964 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-8zg6n"] Mar 18 16:56:28.439224 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:56:28.439193 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/de3fcd99-f2ec-463a-9036-083fd02dc202-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-8zg6n\" (UID: \"de3fcd99-f2ec-463a-9036-083fd02dc202\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-8zg6n" Mar 18 16:56:28.440732 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:56:28.439247 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/de3fcd99-f2ec-463a-9036-083fd02dc202-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-8zg6n\" (UID: \"de3fcd99-f2ec-463a-9036-083fd02dc202\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-8zg6n" Mar 18 16:56:28.440732 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:56:28.439326 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/de3fcd99-f2ec-463a-9036-083fd02dc202-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-8zg6n\" (UID: \"de3fcd99-f2ec-463a-9036-083fd02dc202\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-8zg6n" Mar 18 16:56:28.440732 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:56:28.439358 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pqht\" (UniqueName: \"kubernetes.io/projected/de3fcd99-f2ec-463a-9036-083fd02dc202-kube-api-access-9pqht\") pod \"router-gateway-1-openshift-default-6c59fbf55c-8zg6n\" (UID: \"de3fcd99-f2ec-463a-9036-083fd02dc202\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-8zg6n" Mar 18 16:56:28.440732 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:56:28.439404 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/de3fcd99-f2ec-463a-9036-083fd02dc202-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-8zg6n\" (UID: \"de3fcd99-f2ec-463a-9036-083fd02dc202\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-8zg6n" Mar 18 16:56:28.440732 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:56:28.439425 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: 
\"kubernetes.io/empty-dir/de3fcd99-f2ec-463a-9036-083fd02dc202-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-8zg6n\" (UID: \"de3fcd99-f2ec-463a-9036-083fd02dc202\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-8zg6n" Mar 18 16:56:28.440732 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:56:28.439445 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/de3fcd99-f2ec-463a-9036-083fd02dc202-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-8zg6n\" (UID: \"de3fcd99-f2ec-463a-9036-083fd02dc202\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-8zg6n" Mar 18 16:56:28.440732 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:56:28.439460 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/de3fcd99-f2ec-463a-9036-083fd02dc202-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-8zg6n\" (UID: \"de3fcd99-f2ec-463a-9036-083fd02dc202\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-8zg6n" Mar 18 16:56:28.440732 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:56:28.439474 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/de3fcd99-f2ec-463a-9036-083fd02dc202-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-8zg6n\" (UID: \"de3fcd99-f2ec-463a-9036-083fd02dc202\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-8zg6n" Mar 18 16:56:28.540136 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:56:28.540100 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/de3fcd99-f2ec-463a-9036-083fd02dc202-istiod-ca-cert\") pod 
\"router-gateway-1-openshift-default-6c59fbf55c-8zg6n\" (UID: \"de3fcd99-f2ec-463a-9036-083fd02dc202\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-8zg6n" Mar 18 16:56:28.540136 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:56:28.540140 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/de3fcd99-f2ec-463a-9036-083fd02dc202-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-8zg6n\" (UID: \"de3fcd99-f2ec-463a-9036-083fd02dc202\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-8zg6n" Mar 18 16:56:28.540497 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:56:28.540161 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/de3fcd99-f2ec-463a-9036-083fd02dc202-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-8zg6n\" (UID: \"de3fcd99-f2ec-463a-9036-083fd02dc202\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-8zg6n" Mar 18 16:56:28.540497 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:56:28.540316 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/de3fcd99-f2ec-463a-9036-083fd02dc202-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-8zg6n\" (UID: \"de3fcd99-f2ec-463a-9036-083fd02dc202\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-8zg6n" Mar 18 16:56:28.540497 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:56:28.540386 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/de3fcd99-f2ec-463a-9036-083fd02dc202-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-8zg6n\" (UID: \"de3fcd99-f2ec-463a-9036-083fd02dc202\") " 
pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-8zg6n" Mar 18 16:56:28.540497 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:56:28.540431 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/de3fcd99-f2ec-463a-9036-083fd02dc202-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-8zg6n\" (UID: \"de3fcd99-f2ec-463a-9036-083fd02dc202\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-8zg6n" Mar 18 16:56:28.540497 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:56:28.540498 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9pqht\" (UniqueName: \"kubernetes.io/projected/de3fcd99-f2ec-463a-9036-083fd02dc202-kube-api-access-9pqht\") pod \"router-gateway-1-openshift-default-6c59fbf55c-8zg6n\" (UID: \"de3fcd99-f2ec-463a-9036-083fd02dc202\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-8zg6n" Mar 18 16:56:28.540758 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:56:28.540583 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/de3fcd99-f2ec-463a-9036-083fd02dc202-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-8zg6n\" (UID: \"de3fcd99-f2ec-463a-9036-083fd02dc202\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-8zg6n" Mar 18 16:56:28.540758 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:56:28.540590 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/de3fcd99-f2ec-463a-9036-083fd02dc202-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-8zg6n\" (UID: \"de3fcd99-f2ec-463a-9036-083fd02dc202\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-8zg6n" Mar 18 16:56:28.540758 ip-10-0-132-224 kubenswrapper[2536]: I0318 
16:56:28.540634 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/de3fcd99-f2ec-463a-9036-083fd02dc202-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-8zg6n\" (UID: \"de3fcd99-f2ec-463a-9036-083fd02dc202\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-8zg6n" Mar 18 16:56:28.540758 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:56:28.540652 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/de3fcd99-f2ec-463a-9036-083fd02dc202-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-8zg6n\" (UID: \"de3fcd99-f2ec-463a-9036-083fd02dc202\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-8zg6n" Mar 18 16:56:28.540912 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:56:28.540764 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/de3fcd99-f2ec-463a-9036-083fd02dc202-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-8zg6n\" (UID: \"de3fcd99-f2ec-463a-9036-083fd02dc202\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-8zg6n" Mar 18 16:56:28.540992 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:56:28.540973 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/de3fcd99-f2ec-463a-9036-083fd02dc202-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-8zg6n\" (UID: \"de3fcd99-f2ec-463a-9036-083fd02dc202\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-8zg6n" Mar 18 16:56:28.541042 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:56:28.541000 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/de3fcd99-f2ec-463a-9036-083fd02dc202-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-8zg6n\" (UID: \"de3fcd99-f2ec-463a-9036-083fd02dc202\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-8zg6n" Mar 18 16:56:28.542962 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:56:28.542944 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/de3fcd99-f2ec-463a-9036-083fd02dc202-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-8zg6n\" (UID: \"de3fcd99-f2ec-463a-9036-083fd02dc202\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-8zg6n" Mar 18 16:56:28.543128 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:56:28.543109 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/de3fcd99-f2ec-463a-9036-083fd02dc202-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-8zg6n\" (UID: \"de3fcd99-f2ec-463a-9036-083fd02dc202\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-8zg6n" Mar 18 16:56:28.547552 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:56:28.547528 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/de3fcd99-f2ec-463a-9036-083fd02dc202-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-8zg6n\" (UID: \"de3fcd99-f2ec-463a-9036-083fd02dc202\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-8zg6n" Mar 18 16:56:28.547716 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:56:28.547696 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pqht\" (UniqueName: \"kubernetes.io/projected/de3fcd99-f2ec-463a-9036-083fd02dc202-kube-api-access-9pqht\") pod \"router-gateway-1-openshift-default-6c59fbf55c-8zg6n\" (UID: 
\"de3fcd99-f2ec-463a-9036-083fd02dc202\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-8zg6n" Mar 18 16:56:28.653517 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:56:28.653414 2536 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-8zg6n" Mar 18 16:56:28.778334 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:56:28.778302 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-8zg6n"] Mar 18 16:56:28.781583 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:56:28.781551 2536 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde3fcd99_f2ec_463a_9036_083fd02dc202.slice/crio-98fac2322142635cbc336190a6f18700eb80181df2626a93a4bd1cbd33ee592c WatchSource:0}: Error finding container 98fac2322142635cbc336190a6f18700eb80181df2626a93a4bd1cbd33ee592c: Status 404 returned error can't find the container with id 98fac2322142635cbc336190a6f18700eb80181df2626a93a4bd1cbd33ee592c Mar 18 16:56:28.783831 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:56:28.783781 2536 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Mar 18 16:56:28.783912 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:56:28.783876 2536 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Mar 18 16:56:28.783956 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:56:28.783916 2536 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Mar 18 16:56:29.179812 ip-10-0-132-224 kubenswrapper[2536]: I0318 
16:56:29.179778 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-8zg6n" event={"ID":"de3fcd99-f2ec-463a-9036-083fd02dc202","Type":"ContainerStarted","Data":"73c96e8320164dfe986522eee34793a731c37cd97da0d5b9b817e34c5d9330d7"} Mar 18 16:56:29.179812 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:56:29.179815 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-8zg6n" event={"ID":"de3fcd99-f2ec-463a-9036-083fd02dc202","Type":"ContainerStarted","Data":"98fac2322142635cbc336190a6f18700eb80181df2626a93a4bd1cbd33ee592c"} Mar 18 16:56:29.198975 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:56:29.198878 2536 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-8zg6n" podStartSLOduration=1.198859445 podStartE2EDuration="1.198859445s" podCreationTimestamp="2026-03-18 16:56:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:56:29.198360586 +0000 UTC m=+711.971935172" watchObservedRunningTime="2026-03-18 16:56:29.198859445 +0000 UTC m=+711.972434032" Mar 18 16:56:29.654612 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:56:29.654576 2536 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-8zg6n" Mar 18 16:56:29.659607 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:56:29.659583 2536 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-8zg6n" Mar 18 16:56:30.183417 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:56:30.183389 2536 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-8zg6n" Mar 18 
16:56:30.184293 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:56:30.184259 2536 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-8zg6n" Mar 18 16:56:38.170159 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:56:38.170050 2536 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" image="quay.io/opendatahub/odh-model-serving-api:fast" Mar 18 16:56:38.170733 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:56:38.170294 2536 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:server,Image:quay.io/opendatahub/odh-model-serving-api:fast,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},ContainerPort{Name:metrics,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:TLS_CERT_DIR,Value:/tls,ValueFrom:nil,},EnvVar{Name:GATEWAY_LABEL_SELECTOR,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tls-certs,ReadOnly:true,MountPath:/tls,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-865qs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000680000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod model-serving-api-9699c8d45-g4wpv_kserve(890518d1-0467-414b-93e1-25b9f72d6aca): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image 
source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" logger="UnhandledError" Mar 18 16:56:38.171951 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:56:38.171920 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-g4wpv" podUID="890518d1-0467-414b-93e1-25b9f72d6aca" Mar 18 16:56:50.906961 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:56:50.906918 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-g4wpv" podUID="890518d1-0467-414b-93e1-25b9f72d6aca" Mar 18 16:57:04.906669 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:57:04.906636 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-g4wpv" podUID="890518d1-0467-414b-93e1-25b9f72d6aca" Mar 18 16:57:20.206985 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:57:20.206949 2536 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" image="quay.io/opendatahub/odh-model-serving-api:fast" Mar 18 16:57:20.207343 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:57:20.207115 2536 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:server,Image:quay.io/opendatahub/odh-model-serving-api:fast,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},ContainerPort{Name:metrics,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:TLS_CERT_DIR,Value:/tls,ValueFrom:nil,},EnvVar{Name:GATEWAY_LABEL_SELECTOR,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{134217728 0} {} 
BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tls-certs,ReadOnly:true,MountPath:/tls,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-865qs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000680000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod model-serving-api-9699c8d45-g4wpv_kserve(890518d1-0467-414b-93e1-25b9f72d6aca): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: 
access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" logger="UnhandledError" Mar 18 16:57:20.208285 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:57:20.208249 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-g4wpv" podUID="890518d1-0467-414b-93e1-25b9f72d6aca" Mar 18 16:57:24.799335 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:24.799298 2536 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-7mqqw"] Mar 18 16:57:24.802836 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:24.802815 2536 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-7mqqw" Mar 18 16:57:24.805553 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:24.805531 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-2-openshift-default-dockercfg-plzzk\"" Mar 18 16:57:24.821658 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:24.821628 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-7mqqw"] Mar 18 16:57:24.839499 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:24.839462 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/00095ada-7c54-4ff0-98c6-ff2d9671f369-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-7mqqw\" (UID: \"00095ada-7c54-4ff0-98c6-ff2d9671f369\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-7mqqw" Mar 18 16:57:24.839752 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:24.839728 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/00095ada-7c54-4ff0-98c6-ff2d9671f369-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-7mqqw\" (UID: \"00095ada-7c54-4ff0-98c6-ff2d9671f369\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-7mqqw" Mar 18 16:57:24.839869 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:24.839852 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7p2t\" (UniqueName: \"kubernetes.io/projected/00095ada-7c54-4ff0-98c6-ff2d9671f369-kube-api-access-c7p2t\") pod \"router-gateway-2-openshift-default-6866b85949-7mqqw\" (UID: \"00095ada-7c54-4ff0-98c6-ff2d9671f369\") " 
pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-7mqqw" Mar 18 16:57:24.839994 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:24.839980 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/00095ada-7c54-4ff0-98c6-ff2d9671f369-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-7mqqw\" (UID: \"00095ada-7c54-4ff0-98c6-ff2d9671f369\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-7mqqw" Mar 18 16:57:24.840200 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:24.840179 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/00095ada-7c54-4ff0-98c6-ff2d9671f369-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-7mqqw\" (UID: \"00095ada-7c54-4ff0-98c6-ff2d9671f369\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-7mqqw" Mar 18 16:57:24.840384 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:24.840366 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/00095ada-7c54-4ff0-98c6-ff2d9671f369-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-7mqqw\" (UID: \"00095ada-7c54-4ff0-98c6-ff2d9671f369\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-7mqqw" Mar 18 16:57:24.840500 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:24.840485 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/00095ada-7c54-4ff0-98c6-ff2d9671f369-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-7mqqw\" (UID: \"00095ada-7c54-4ff0-98c6-ff2d9671f369\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-7mqqw" 
Mar 18 16:57:24.840613 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:24.840600 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/00095ada-7c54-4ff0-98c6-ff2d9671f369-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-7mqqw\" (UID: \"00095ada-7c54-4ff0-98c6-ff2d9671f369\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-7mqqw" Mar 18 16:57:24.840734 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:24.840699 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/00095ada-7c54-4ff0-98c6-ff2d9671f369-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-7mqqw\" (UID: \"00095ada-7c54-4ff0-98c6-ff2d9671f369\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-7mqqw" Mar 18 16:57:24.941512 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:24.941476 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/00095ada-7c54-4ff0-98c6-ff2d9671f369-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-7mqqw\" (UID: \"00095ada-7c54-4ff0-98c6-ff2d9671f369\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-7mqqw" Mar 18 16:57:24.941512 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:24.941518 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/00095ada-7c54-4ff0-98c6-ff2d9671f369-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-7mqqw\" (UID: \"00095ada-7c54-4ff0-98c6-ff2d9671f369\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-7mqqw" Mar 18 16:57:24.941735 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:24.941541 2536 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/00095ada-7c54-4ff0-98c6-ff2d9671f369-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-7mqqw\" (UID: \"00095ada-7c54-4ff0-98c6-ff2d9671f369\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-7mqqw" Mar 18 16:57:24.941735 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:24.941560 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/00095ada-7c54-4ff0-98c6-ff2d9671f369-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-7mqqw\" (UID: \"00095ada-7c54-4ff0-98c6-ff2d9671f369\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-7mqqw" Mar 18 16:57:24.941808 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:24.941752 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/00095ada-7c54-4ff0-98c6-ff2d9671f369-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-7mqqw\" (UID: \"00095ada-7c54-4ff0-98c6-ff2d9671f369\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-7mqqw" Mar 18 16:57:24.941808 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:24.941798 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/00095ada-7c54-4ff0-98c6-ff2d9671f369-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-7mqqw\" (UID: \"00095ada-7c54-4ff0-98c6-ff2d9671f369\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-7mqqw" Mar 18 16:57:24.941882 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:24.941830 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c7p2t\" (UniqueName: 
\"kubernetes.io/projected/00095ada-7c54-4ff0-98c6-ff2d9671f369-kube-api-access-c7p2t\") pod \"router-gateway-2-openshift-default-6866b85949-7mqqw\" (UID: \"00095ada-7c54-4ff0-98c6-ff2d9671f369\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-7mqqw" Mar 18 16:57:24.941882 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:24.941871 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/00095ada-7c54-4ff0-98c6-ff2d9671f369-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-7mqqw\" (UID: \"00095ada-7c54-4ff0-98c6-ff2d9671f369\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-7mqqw" Mar 18 16:57:24.941983 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:24.941959 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/00095ada-7c54-4ff0-98c6-ff2d9671f369-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-7mqqw\" (UID: \"00095ada-7c54-4ff0-98c6-ff2d9671f369\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-7mqqw" Mar 18 16:57:24.942037 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:24.941990 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/00095ada-7c54-4ff0-98c6-ff2d9671f369-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-7mqqw\" (UID: \"00095ada-7c54-4ff0-98c6-ff2d9671f369\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-7mqqw" Mar 18 16:57:24.942037 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:24.941997 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/00095ada-7c54-4ff0-98c6-ff2d9671f369-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-7mqqw\" (UID: 
\"00095ada-7c54-4ff0-98c6-ff2d9671f369\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-7mqqw" Mar 18 16:57:24.942037 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:24.942018 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/00095ada-7c54-4ff0-98c6-ff2d9671f369-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-7mqqw\" (UID: \"00095ada-7c54-4ff0-98c6-ff2d9671f369\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-7mqqw" Mar 18 16:57:24.942180 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:24.942071 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/00095ada-7c54-4ff0-98c6-ff2d9671f369-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-7mqqw\" (UID: \"00095ada-7c54-4ff0-98c6-ff2d9671f369\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-7mqqw" Mar 18 16:57:24.942752 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:24.942727 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/00095ada-7c54-4ff0-98c6-ff2d9671f369-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-7mqqw\" (UID: \"00095ada-7c54-4ff0-98c6-ff2d9671f369\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-7mqqw" Mar 18 16:57:24.944127 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:24.944097 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/00095ada-7c54-4ff0-98c6-ff2d9671f369-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-7mqqw\" (UID: \"00095ada-7c54-4ff0-98c6-ff2d9671f369\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-7mqqw" Mar 18 16:57:24.944408 ip-10-0-132-224 
kubenswrapper[2536]: I0318 16:57:24.944391 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/00095ada-7c54-4ff0-98c6-ff2d9671f369-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-7mqqw\" (UID: \"00095ada-7c54-4ff0-98c6-ff2d9671f369\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-7mqqw" Mar 18 16:57:24.948448 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:24.948426 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/00095ada-7c54-4ff0-98c6-ff2d9671f369-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-7mqqw\" (UID: \"00095ada-7c54-4ff0-98c6-ff2d9671f369\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-7mqqw" Mar 18 16:57:24.948668 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:24.948647 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7p2t\" (UniqueName: \"kubernetes.io/projected/00095ada-7c54-4ff0-98c6-ff2d9671f369-kube-api-access-c7p2t\") pod \"router-gateway-2-openshift-default-6866b85949-7mqqw\" (UID: \"00095ada-7c54-4ff0-98c6-ff2d9671f369\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-7mqqw" Mar 18 16:57:25.113258 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:25.113156 2536 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-7mqqw" Mar 18 16:57:25.287959 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:25.287933 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-7mqqw"] Mar 18 16:57:25.290123 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:57:25.290090 2536 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00095ada_7c54_4ff0_98c6_ff2d9671f369.slice/crio-95ba87cd2990377508704075f12adc5b0e15f6b3d397094dafc8076153ca545c WatchSource:0}: Error finding container 95ba87cd2990377508704075f12adc5b0e15f6b3d397094dafc8076153ca545c: Status 404 returned error can't find the container with id 95ba87cd2990377508704075f12adc5b0e15f6b3d397094dafc8076153ca545c Mar 18 16:57:25.292309 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:25.292234 2536 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Mar 18 16:57:25.292410 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:25.292348 2536 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Mar 18 16:57:25.292410 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:25.292379 2536 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Mar 18 16:57:25.371424 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:25.371397 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-7mqqw" 
event={"ID":"00095ada-7c54-4ff0-98c6-ff2d9671f369","Type":"ContainerStarted","Data":"95ba87cd2990377508704075f12adc5b0e15f6b3d397094dafc8076153ca545c"} Mar 18 16:57:26.376539 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:26.376500 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-7mqqw" event={"ID":"00095ada-7c54-4ff0-98c6-ff2d9671f369","Type":"ContainerStarted","Data":"9eb02ba66ac96c66c55a310f554e8b052bb1aa39da37dc8fc0da1b0437002f69"} Mar 18 16:57:26.394729 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:26.394674 2536 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-7mqqw" podStartSLOduration=2.3946430850000002 podStartE2EDuration="2.394643085s" podCreationTimestamp="2026-03-18 16:57:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:57:26.393692571 +0000 UTC m=+769.167267149" watchObservedRunningTime="2026-03-18 16:57:26.394643085 +0000 UTC m=+769.168217668" Mar 18 16:57:27.113946 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:27.113905 2536 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-7mqqw" Mar 18 16:57:27.119058 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:27.119032 2536 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-7mqqw" Mar 18 16:57:27.380278 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:27.380174 2536 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-7mqqw" Mar 18 16:57:27.381346 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:27.381323 2536 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-7mqqw" Mar 18 16:57:32.907027 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:57:32.906993 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-g4wpv" podUID="890518d1-0467-414b-93e1-25b9f72d6aca" Mar 18 16:57:40.291576 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:40.291548 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-8zg6n_de3fcd99-f2ec-463a-9036-083fd02dc202/istio-proxy/0.log" Mar 18 16:57:40.319428 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:40.319400 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-7mqqw_00095ada-7c54-4ff0-98c6-ff2d9671f369/istio-proxy/0.log" Mar 18 16:57:40.871208 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:40.871174 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-8zg6n_de3fcd99-f2ec-463a-9036-083fd02dc202/istio-proxy/0.log" Mar 18 16:57:40.886106 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:40.886079 2536 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-7mqqw_00095ada-7c54-4ff0-98c6-ff2d9671f369/istio-proxy/0.log" Mar 18 16:57:41.432842 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:41.432811 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-8zg6n_de3fcd99-f2ec-463a-9036-083fd02dc202/istio-proxy/0.log" Mar 18 16:57:41.448086 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:41.448057 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-7mqqw_00095ada-7c54-4ff0-98c6-ff2d9671f369/istio-proxy/0.log" Mar 18 16:57:41.968868 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:41.968824 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-8zg6n_de3fcd99-f2ec-463a-9036-083fd02dc202/istio-proxy/0.log" Mar 18 16:57:41.982528 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:41.982504 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-7mqqw_00095ada-7c54-4ff0-98c6-ff2d9671f369/istio-proxy/0.log" Mar 18 16:57:42.512922 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:42.512893 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-8zg6n_de3fcd99-f2ec-463a-9036-083fd02dc202/istio-proxy/0.log" Mar 18 16:57:42.527341 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:42.527321 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-7mqqw_00095ada-7c54-4ff0-98c6-ff2d9671f369/istio-proxy/0.log" Mar 18 16:57:43.058717 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:43.058688 2536 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-8zg6n_de3fcd99-f2ec-463a-9036-083fd02dc202/istio-proxy/0.log" Mar 18 16:57:43.072876 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:43.072855 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-7mqqw_00095ada-7c54-4ff0-98c6-ff2d9671f369/istio-proxy/0.log" Mar 18 16:57:43.602098 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:43.602070 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-8zg6n_de3fcd99-f2ec-463a-9036-083fd02dc202/istio-proxy/0.log" Mar 18 16:57:43.618278 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:43.618251 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-7mqqw_00095ada-7c54-4ff0-98c6-ff2d9671f369/istio-proxy/0.log" Mar 18 16:57:44.144285 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:44.144244 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-8zg6n_de3fcd99-f2ec-463a-9036-083fd02dc202/istio-proxy/0.log" Mar 18 16:57:44.160376 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:44.160349 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-7mqqw_00095ada-7c54-4ff0-98c6-ff2d9671f369/istio-proxy/0.log" Mar 18 16:57:44.692045 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:44.692016 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-8zg6n_de3fcd99-f2ec-463a-9036-083fd02dc202/istio-proxy/0.log" Mar 18 16:57:44.706819 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:44.706794 2536 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-7mqqw_00095ada-7c54-4ff0-98c6-ff2d9671f369/istio-proxy/0.log" Mar 18 16:57:45.235365 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:45.235338 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-8zg6n_de3fcd99-f2ec-463a-9036-083fd02dc202/istio-proxy/0.log" Mar 18 16:57:45.249985 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:45.249960 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-7mqqw_00095ada-7c54-4ff0-98c6-ff2d9671f369/istio-proxy/0.log" Mar 18 16:57:45.776523 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:45.776491 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-8zg6n_de3fcd99-f2ec-463a-9036-083fd02dc202/istio-proxy/0.log" Mar 18 16:57:45.793475 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:45.793450 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-7mqqw_00095ada-7c54-4ff0-98c6-ff2d9671f369/istio-proxy/0.log" Mar 18 16:57:46.333102 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:46.333069 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-8zg6n_de3fcd99-f2ec-463a-9036-083fd02dc202/istio-proxy/0.log" Mar 18 16:57:46.346978 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:46.346958 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-7mqqw_00095ada-7c54-4ff0-98c6-ff2d9671f369/istio-proxy/0.log" Mar 18 16:57:46.870202 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:46.870160 2536 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-8zg6n_de3fcd99-f2ec-463a-9036-083fd02dc202/istio-proxy/0.log" Mar 18 16:57:46.884392 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:46.884372 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-7mqqw_00095ada-7c54-4ff0-98c6-ff2d9671f369/istio-proxy/0.log" Mar 18 16:57:47.439499 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:47.439471 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-8zg6n_de3fcd99-f2ec-463a-9036-083fd02dc202/istio-proxy/0.log" Mar 18 16:57:47.458529 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:47.458503 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-7mqqw_00095ada-7c54-4ff0-98c6-ff2d9671f369/istio-proxy/0.log" Mar 18 16:57:47.908823 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:57:47.908796 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-g4wpv" podUID="890518d1-0467-414b-93e1-25b9f72d6aca" Mar 18 16:57:48.052859 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:48.052833 2536 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-gxrtl_3c55b02a-defe-494b-9c4b-3312f62ed11b/istio-proxy/0.log" Mar 18 16:57:48.071881 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:48.071862 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-559fbc86fb-nz727_2aea10b4-ea4d-46ca-a2c5-159a563fc276/router/0.log" Mar 18 16:57:48.696219 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:48.696194 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-8zg6n_de3fcd99-f2ec-463a-9036-083fd02dc202/istio-proxy/0.log" Mar 18 16:57:48.710597 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:48.710577 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-7mqqw_00095ada-7c54-4ff0-98c6-ff2d9671f369/istio-proxy/0.log" Mar 18 16:57:49.308525 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:49.308497 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-8zg6n_de3fcd99-f2ec-463a-9036-083fd02dc202/istio-proxy/0.log" Mar 18 16:57:49.323188 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:49.323161 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-7mqqw_00095ada-7c54-4ff0-98c6-ff2d9671f369/istio-proxy/0.log" Mar 18 16:57:49.923823 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:49.923790 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-gxrtl_3c55b02a-defe-494b-9c4b-3312f62ed11b/istio-proxy/0.log" Mar 18 16:57:49.942008 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:49.941975 2536 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_router-default-559fbc86fb-nz727_2aea10b4-ea4d-46ca-a2c5-159a563fc276/router/0.log" Mar 18 16:57:50.590837 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:50.590810 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-jjpnw_1b49104f-557f-4edc-a510-e922d1578fb9/authorino/0.log" Mar 18 16:57:50.620498 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:50.620462 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-5zl8n_4ac0b078-2731-455d-b061-ed53f3521660/manager/0.log" Mar 18 16:57:50.632123 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:50.632100 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-4699c_3f985d57-b8fc-45bb-9d34-5007209116e4/kuadrant-console-plugin/0.log" Mar 18 16:57:51.357553 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:51.357525 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-jjpnw_1b49104f-557f-4edc-a510-e922d1578fb9/authorino/0.log" Mar 18 16:57:51.392780 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:51.392752 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-5zl8n_4ac0b078-2731-455d-b061-ed53f3521660/manager/0.log" Mar 18 16:57:51.405929 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:51.405906 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-4699c_3f985d57-b8fc-45bb-9d34-5007209116e4/kuadrant-console-plugin/0.log" Mar 18 16:57:52.124115 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:52.124090 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-jjpnw_1b49104f-557f-4edc-a510-e922d1578fb9/authorino/0.log" Mar 18 16:57:52.152784 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:52.152757 
2536 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-5zl8n_4ac0b078-2731-455d-b061-ed53f3521660/manager/0.log" Mar 18 16:57:52.164106 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:52.164088 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-4699c_3f985d57-b8fc-45bb-9d34-5007209116e4/kuadrant-console-plugin/0.log" Mar 18 16:57:52.843316 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:52.843287 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-jjpnw_1b49104f-557f-4edc-a510-e922d1578fb9/authorino/0.log" Mar 18 16:57:52.872868 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:52.872842 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-5zl8n_4ac0b078-2731-455d-b061-ed53f3521660/manager/0.log" Mar 18 16:57:52.885007 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:52.884982 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-4699c_3f985d57-b8fc-45bb-9d34-5007209116e4/kuadrant-console-plugin/0.log" Mar 18 16:57:53.573601 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:53.573575 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-jjpnw_1b49104f-557f-4edc-a510-e922d1578fb9/authorino/0.log" Mar 18 16:57:53.602202 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:53.602174 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-5zl8n_4ac0b078-2731-455d-b061-ed53f3521660/manager/0.log" Mar 18 16:57:53.614011 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:53.613989 2536 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-4699c_3f985d57-b8fc-45bb-9d34-5007209116e4/kuadrant-console-plugin/0.log" Mar 18 16:57:58.467194 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:58.467167 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-f52ql_b998ab02-161a-40e9-9e53-7183a98152de/global-pull-secret-syncer/0.log" Mar 18 16:57:58.581154 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:58.581122 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-pj5bn_c87e1483-58f2-438b-af86-c607ffcbf01c/konnectivity-agent/0.log" Mar 18 16:57:58.654310 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:57:58.654260 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-132-224.ec2.internal_7a6f0eddc0950f127c5c3c1035cc71a3/haproxy/0.log" Mar 18 16:58:02.753764 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:02.753727 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-jjpnw_1b49104f-557f-4edc-a510-e922d1578fb9/authorino/0.log" Mar 18 16:58:02.820953 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:02.820927 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-5zl8n_4ac0b078-2731-455d-b061-ed53f3521660/manager/0.log" Mar 18 16:58:02.845318 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:02.845259 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-4699c_3f985d57-b8fc-45bb-9d34-5007209116e4/kuadrant-console-plugin/0.log" Mar 18 16:58:02.907129 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:58:02.907103 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to 
pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-g4wpv" podUID="890518d1-0467-414b-93e1-25b9f72d6aca" Mar 18 16:58:03.874148 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:03.874077 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a130b1b3-5902-463d-ac4c-4f7aaa59cb2f/alertmanager/0.log" Mar 18 16:58:03.897431 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:03.897407 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a130b1b3-5902-463d-ac4c-4f7aaa59cb2f/config-reloader/0.log" Mar 18 16:58:03.919665 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:03.919649 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a130b1b3-5902-463d-ac4c-4f7aaa59cb2f/kube-rbac-proxy-web/0.log" Mar 18 16:58:03.945623 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:03.945607 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a130b1b3-5902-463d-ac4c-4f7aaa59cb2f/kube-rbac-proxy/0.log" Mar 18 16:58:03.968819 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:03.968796 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a130b1b3-5902-463d-ac4c-4f7aaa59cb2f/kube-rbac-proxy-metric/0.log" Mar 18 16:58:03.991339 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:03.991320 2536 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a130b1b3-5902-463d-ac4c-4f7aaa59cb2f/prom-label-proxy/0.log" Mar 18 16:58:04.012763 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:04.012742 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a130b1b3-5902-463d-ac4c-4f7aaa59cb2f/init-config-reloader/0.log" Mar 18 16:58:04.189016 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:04.188986 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-6df7999c47-v5fw4_25bd2c32-7449-4f75-ab0e-7b815e14c3ca/kube-state-metrics/0.log" Mar 18 16:58:04.218094 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:04.218075 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-6df7999c47-v5fw4_25bd2c32-7449-4f75-ab0e-7b815e14c3ca/kube-rbac-proxy-main/0.log" Mar 18 16:58:04.239033 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:04.239009 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-6df7999c47-v5fw4_25bd2c32-7449-4f75-ab0e-7b815e14c3ca/kube-rbac-proxy-self/0.log" Mar 18 16:58:04.337372 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:04.337344 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dqszd_4d7aaf65-f284-4670-8d82-f69adb1a0774/node-exporter/0.log" Mar 18 16:58:04.363054 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:04.363033 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dqszd_4d7aaf65-f284-4670-8d82-f69adb1a0774/kube-rbac-proxy/0.log" Mar 18 16:58:04.384211 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:04.384184 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dqszd_4d7aaf65-f284-4670-8d82-f69adb1a0774/init-textfile/0.log" Mar 18 16:58:04.580482 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:04.580457 2536 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-68b5d5d464-qmgzp_556af7ea-9e15-4fb2-b326-af5947c1713a/kube-rbac-proxy-main/0.log" Mar 18 16:58:04.608043 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:04.608016 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-68b5d5d464-qmgzp_556af7ea-9e15-4fb2-b326-af5947c1713a/kube-rbac-proxy-self/0.log" Mar 18 16:58:04.629048 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:04.629030 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-68b5d5d464-qmgzp_556af7ea-9e15-4fb2-b326-af5947c1713a/openshift-state-metrics/0.log" Mar 18 16:58:04.691373 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:04.691293 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a30a0e81-c0ff-4186-be97-f99f05750aad/prometheus/0.log" Mar 18 16:58:04.710795 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:04.710772 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a30a0e81-c0ff-4186-be97-f99f05750aad/config-reloader/0.log" Mar 18 16:58:04.731672 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:04.731651 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a30a0e81-c0ff-4186-be97-f99f05750aad/thanos-sidecar/0.log" Mar 18 16:58:04.760647 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:04.760629 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a30a0e81-c0ff-4186-be97-f99f05750aad/kube-rbac-proxy-web/0.log" Mar 18 16:58:04.781882 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:04.781863 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a30a0e81-c0ff-4186-be97-f99f05750aad/kube-rbac-proxy/0.log" Mar 18 16:58:04.807410 ip-10-0-132-224 
kubenswrapper[2536]: I0318 16:58:04.807393 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a30a0e81-c0ff-4186-be97-f99f05750aad/kube-rbac-proxy-thanos/0.log" Mar 18 16:58:04.841474 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:04.841457 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a30a0e81-c0ff-4186-be97-f99f05750aad/init-config-reloader/0.log" Mar 18 16:58:06.320008 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:06.319977 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-55b77584bb-j76sf_c059e498-a993-46a1-8de3-8ae39045a1e7/networking-console-plugin/0.log" Mar 18 16:58:06.849484 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:06.849455 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-mx9sz_95e73a3f-8a85-403f-b00b-17524a80b500/console-operator/2.log" Mar 18 16:58:06.853235 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:06.853214 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-mx9sz_95e73a3f-8a85-403f-b00b-17524a80b500/console-operator/3.log" Mar 18 16:58:07.342698 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:07.342667 2536 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zsmk9/perf-node-gather-daemonset-l96wm"] Mar 18 16:58:07.348438 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:07.348411 2536 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-l96wm" Mar 18 16:58:07.351898 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:07.351869 2536 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-zsmk9\"/\"default-dockercfg-vmgwf\"" Mar 18 16:58:07.352380 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:07.352359 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-zsmk9\"/\"openshift-service-ca.crt\"" Mar 18 16:58:07.352592 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:07.352570 2536 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-zsmk9\"/\"kube-root-ca.crt\"" Mar 18 16:58:07.353357 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:07.353322 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zsmk9/perf-node-gather-daemonset-l96wm"] Mar 18 16:58:07.519382 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:07.519345 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7b35ca8c-1eac-47d8-9f2a-c2eb8a1c2d98-sys\") pod \"perf-node-gather-daemonset-l96wm\" (UID: \"7b35ca8c-1eac-47d8-9f2a-c2eb8a1c2d98\") " pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-l96wm" Mar 18 16:58:07.519572 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:07.519386 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7b35ca8c-1eac-47d8-9f2a-c2eb8a1c2d98-lib-modules\") pod \"perf-node-gather-daemonset-l96wm\" (UID: \"7b35ca8c-1eac-47d8-9f2a-c2eb8a1c2d98\") " pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-l96wm" Mar 18 16:58:07.519572 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:07.519519 2536 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bhtv\" (UniqueName: \"kubernetes.io/projected/7b35ca8c-1eac-47d8-9f2a-c2eb8a1c2d98-kube-api-access-2bhtv\") pod \"perf-node-gather-daemonset-l96wm\" (UID: \"7b35ca8c-1eac-47d8-9f2a-c2eb8a1c2d98\") " pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-l96wm" Mar 18 16:58:07.519572 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:07.519549 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7b35ca8c-1eac-47d8-9f2a-c2eb8a1c2d98-proc\") pod \"perf-node-gather-daemonset-l96wm\" (UID: \"7b35ca8c-1eac-47d8-9f2a-c2eb8a1c2d98\") " pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-l96wm" Mar 18 16:58:07.519713 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:07.519627 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7b35ca8c-1eac-47d8-9f2a-c2eb8a1c2d98-podres\") pod \"perf-node-gather-daemonset-l96wm\" (UID: \"7b35ca8c-1eac-47d8-9f2a-c2eb8a1c2d98\") " pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-l96wm" Mar 18 16:58:07.620294 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:07.620184 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2bhtv\" (UniqueName: \"kubernetes.io/projected/7b35ca8c-1eac-47d8-9f2a-c2eb8a1c2d98-kube-api-access-2bhtv\") pod \"perf-node-gather-daemonset-l96wm\" (UID: \"7b35ca8c-1eac-47d8-9f2a-c2eb8a1c2d98\") " pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-l96wm" Mar 18 16:58:07.620294 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:07.620232 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7b35ca8c-1eac-47d8-9f2a-c2eb8a1c2d98-proc\") pod \"perf-node-gather-daemonset-l96wm\" (UID: 
\"7b35ca8c-1eac-47d8-9f2a-c2eb8a1c2d98\") " pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-l96wm" Mar 18 16:58:07.620294 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:07.620288 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7b35ca8c-1eac-47d8-9f2a-c2eb8a1c2d98-podres\") pod \"perf-node-gather-daemonset-l96wm\" (UID: \"7b35ca8c-1eac-47d8-9f2a-c2eb8a1c2d98\") " pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-l96wm" Mar 18 16:58:07.620569 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:07.620322 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7b35ca8c-1eac-47d8-9f2a-c2eb8a1c2d98-sys\") pod \"perf-node-gather-daemonset-l96wm\" (UID: \"7b35ca8c-1eac-47d8-9f2a-c2eb8a1c2d98\") " pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-l96wm" Mar 18 16:58:07.620569 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:07.620354 2536 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7b35ca8c-1eac-47d8-9f2a-c2eb8a1c2d98-lib-modules\") pod \"perf-node-gather-daemonset-l96wm\" (UID: \"7b35ca8c-1eac-47d8-9f2a-c2eb8a1c2d98\") " pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-l96wm" Mar 18 16:58:07.620569 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:07.620357 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7b35ca8c-1eac-47d8-9f2a-c2eb8a1c2d98-proc\") pod \"perf-node-gather-daemonset-l96wm\" (UID: \"7b35ca8c-1eac-47d8-9f2a-c2eb8a1c2d98\") " pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-l96wm" Mar 18 16:58:07.620569 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:07.620423 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/7b35ca8c-1eac-47d8-9f2a-c2eb8a1c2d98-sys\") pod \"perf-node-gather-daemonset-l96wm\" (UID: \"7b35ca8c-1eac-47d8-9f2a-c2eb8a1c2d98\") " pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-l96wm" Mar 18 16:58:07.620569 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:07.620462 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7b35ca8c-1eac-47d8-9f2a-c2eb8a1c2d98-lib-modules\") pod \"perf-node-gather-daemonset-l96wm\" (UID: \"7b35ca8c-1eac-47d8-9f2a-c2eb8a1c2d98\") " pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-l96wm" Mar 18 16:58:07.620569 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:07.620460 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7b35ca8c-1eac-47d8-9f2a-c2eb8a1c2d98-podres\") pod \"perf-node-gather-daemonset-l96wm\" (UID: \"7b35ca8c-1eac-47d8-9f2a-c2eb8a1c2d98\") " pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-l96wm" Mar 18 16:58:07.628340 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:07.628317 2536 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bhtv\" (UniqueName: \"kubernetes.io/projected/7b35ca8c-1eac-47d8-9f2a-c2eb8a1c2d98-kube-api-access-2bhtv\") pod \"perf-node-gather-daemonset-l96wm\" (UID: \"7b35ca8c-1eac-47d8-9f2a-c2eb8a1c2d98\") " pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-l96wm" Mar 18 16:58:07.661641 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:07.661614 2536 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-l96wm" Mar 18 16:58:07.788296 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:07.788256 2536 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zsmk9/perf-node-gather-daemonset-l96wm"] Mar 18 16:58:07.790756 ip-10-0-132-224 kubenswrapper[2536]: W0318 16:58:07.790722 2536 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod7b35ca8c_1eac_47d8_9f2a_c2eb8a1c2d98.slice/crio-983b495ac8cca932447be730728bb0eb9c44f0aaa825d9d71d320dd2e2a5f6ea WatchSource:0}: Error finding container 983b495ac8cca932447be730728bb0eb9c44f0aaa825d9d71d320dd2e2a5f6ea: Status 404 returned error can't find the container with id 983b495ac8cca932447be730728bb0eb9c44f0aaa825d9d71d320dd2e2a5f6ea Mar 18 16:58:08.520349 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:08.520308 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-l96wm" event={"ID":"7b35ca8c-1eac-47d8-9f2a-c2eb8a1c2d98","Type":"ContainerStarted","Data":"ee06bfeffdab122a52fb4cffdb66724c7c345c0c031c026728d9cd58cccadc71"} Mar 18 16:58:08.520349 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:08.520348 2536 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-l96wm" event={"ID":"7b35ca8c-1eac-47d8-9f2a-c2eb8a1c2d98","Type":"ContainerStarted","Data":"983b495ac8cca932447be730728bb0eb9c44f0aaa825d9d71d320dd2e2a5f6ea"} Mar 18 16:58:08.520896 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:08.520508 2536 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-l96wm" Mar 18 16:58:08.533196 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:08.533173 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-l45t6_c40b2ed4-792d-4afc-bc1a-3aad44ac26e0/dns/0.log" Mar 18 
16:58:08.536841 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:08.536799 2536 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-l96wm" podStartSLOduration=1.5367858559999998 podStartE2EDuration="1.536785856s" podCreationTimestamp="2026-03-18 16:58:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:58:08.535833465 +0000 UTC m=+811.309408065" watchObservedRunningTime="2026-03-18 16:58:08.536785856 +0000 UTC m=+811.310360442" Mar 18 16:58:08.553697 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:08.553675 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-l45t6_c40b2ed4-792d-4afc-bc1a-3aad44ac26e0/kube-rbac-proxy/0.log" Mar 18 16:58:08.655218 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:08.655197 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9fw2f_37693aba-cef0-4f5a-a523-bd82dbff0143/dns-node-resolver/0.log" Mar 18 16:58:09.116010 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:09.115978 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-8598bb85b4-b5rkb_e5f11bd2-354b-4f12-b187-4eef4f830794/registry/0.log" Mar 18 16:58:09.159008 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:09.158981 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-jnvc6_a421b67e-f253-48e4-b5a3-4a895c3bf6d2/node-ca/0.log" Mar 18 16:58:10.025881 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:10.025842 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-gxrtl_3c55b02a-defe-494b-9c4b-3312f62ed11b/istio-proxy/0.log" Mar 18 16:58:10.050659 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:10.050617 2536 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_router-default-559fbc86fb-nz727_2aea10b4-ea4d-46ca-a2c5-159a563fc276/router/0.log" Mar 18 16:58:10.539932 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:10.539899 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-x862v_0f569b4c-304c-4347-827a-116204073ddf/serve-healthcheck-canary/0.log" Mar 18 16:58:11.133685 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:11.133659 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-s7d2l_9da589e5-8a15-4bfd-8947-4f0291b208b2/kube-rbac-proxy/0.log" Mar 18 16:58:11.158780 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:11.158759 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-s7d2l_9da589e5-8a15-4bfd-8947-4f0291b208b2/exporter/0.log" Mar 18 16:58:11.180039 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:11.179999 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-s7d2l_9da589e5-8a15-4bfd-8947-4f0291b208b2/extractor/0.log" Mar 18 16:58:13.625882 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:13.625854 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-84978b767b-9bt9l_d80b0f51-0292-4738-a17d-217b8bebbe6f/manager/0.log" Mar 18 16:58:14.214307 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:14.214279 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-69d7c9bbdc-vsjmt_8a0900f5-c051-4ef0-b3db-f7e1bf05aad5/manager/0.log" Mar 18 16:58:14.286636 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:14.286608 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-7ldxj_d66d4f5b-9ca9-46b1-bfd5-2e7ca71f251c/manager/0.log" Mar 18 16:58:14.333480 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:14.333443 2536 log.go:25] 
"Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-748c497bc-8622b_facc2261-0bab-4970-a32e-5e7528858ab1/seaweedfs/0.log" Mar 18 16:58:14.533572 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:14.533504 2536 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-l96wm" Mar 18 16:58:17.911355 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:17.911247 2536 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 16:58:17.911570 ip-10-0-132-224 kubenswrapper[2536]: E0318 16:58:17.911455 2536 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-g4wpv" podUID="890518d1-0467-414b-93e1-25b9f72d6aca" Mar 18 16:58:20.360501 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:20.360471 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-74479_c68d2b91-1efd-47b8-93dc-98606a96920b/kube-multus/0.log" Mar 18 16:58:20.739584 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:20.739555 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d7xhk_96282abf-ce09-4b33-baaf-73f9c5329541/kube-multus-additional-cni-plugins/0.log" Mar 18 16:58:20.760927 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:20.760908 2536 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d7xhk_96282abf-ce09-4b33-baaf-73f9c5329541/egress-router-binary-copy/0.log" Mar 18 16:58:20.782530 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:20.782513 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d7xhk_96282abf-ce09-4b33-baaf-73f9c5329541/cni-plugins/0.log" Mar 18 16:58:20.804087 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:20.804068 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d7xhk_96282abf-ce09-4b33-baaf-73f9c5329541/bond-cni-plugin/0.log" Mar 18 16:58:20.827601 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:20.827583 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d7xhk_96282abf-ce09-4b33-baaf-73f9c5329541/routeoverride-cni/0.log" Mar 18 16:58:20.849792 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:20.849773 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d7xhk_96282abf-ce09-4b33-baaf-73f9c5329541/whereabouts-cni-bincopy/0.log" Mar 18 16:58:20.870604 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:20.870584 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d7xhk_96282abf-ce09-4b33-baaf-73f9c5329541/whereabouts-cni/0.log" Mar 18 16:58:20.984460 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:20.984431 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-rjx6m_275a2fa6-277f-40dc-a2bc-749a97550e2e/network-metrics-daemon/0.log" Mar 18 16:58:21.003321 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:21.003243 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-rjx6m_275a2fa6-277f-40dc-a2bc-749a97550e2e/kube-rbac-proxy/0.log" Mar 18 
16:58:22.496822 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:22.496779 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ts4mw_3e6fe5a1-3b13-4d51-a141-1280e1b25b3a/ovn-controller/0.log" Mar 18 16:58:22.514580 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:22.514556 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ts4mw_3e6fe5a1-3b13-4d51-a141-1280e1b25b3a/ovn-acl-logging/0.log" Mar 18 16:58:22.519349 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:22.519324 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ts4mw_3e6fe5a1-3b13-4d51-a141-1280e1b25b3a/ovn-acl-logging/1.log" Mar 18 16:58:22.538233 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:22.538214 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ts4mw_3e6fe5a1-3b13-4d51-a141-1280e1b25b3a/kube-rbac-proxy-node/0.log" Mar 18 16:58:22.558071 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:22.558022 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ts4mw_3e6fe5a1-3b13-4d51-a141-1280e1b25b3a/kube-rbac-proxy-ovn-metrics/0.log" Mar 18 16:58:22.576352 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:22.576333 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ts4mw_3e6fe5a1-3b13-4d51-a141-1280e1b25b3a/northd/0.log" Mar 18 16:58:22.596329 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:22.596286 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ts4mw_3e6fe5a1-3b13-4d51-a141-1280e1b25b3a/nbdb/0.log" Mar 18 16:58:22.617964 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:22.617947 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ts4mw_3e6fe5a1-3b13-4d51-a141-1280e1b25b3a/sbdb/0.log" Mar 18 
16:58:22.730893 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:22.730870 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ts4mw_3e6fe5a1-3b13-4d51-a141-1280e1b25b3a/ovnkube-controller/0.log" Mar 18 16:58:23.821053 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:23.821016 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-cc88fdd44-2r9d4_cfac9d9b-1748-485d-8093-c404ac8d2d3d/check-endpoints/0.log" Mar 18 16:58:23.899253 ip-10-0-132-224 kubenswrapper[2536]: I0318 16:58:23.899224 2536 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-cr96r_440da786-0ff1-4727-bd36-e32a3acc5a3c/network-check-target-container/0.log"